Nov 26 13:30:31 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 26 13:30:32 crc kubenswrapper[5200]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 26 13:30:32 crc kubenswrapper[5200]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Nov 26 13:30:32 crc kubenswrapper[5200]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 26 13:30:32 crc kubenswrapper[5200]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 26 13:30:32 crc kubenswrapper[5200]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Nov 26 13:30:32 crc kubenswrapper[5200]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.314992 5200 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324523 5200 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324547 5200 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324554 5200 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324559 5200 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324564 5200 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324569 5200 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324573 5200 feature_gate.go:328] unrecognized feature gate: Example2
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324579 5200 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324585 5200 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324592 5200 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324598 5200 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324603 5200 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324608 5200 feature_gate.go:328] unrecognized feature
gate: InsightsConfigAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324612 5200 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324617 5200 feature_gate.go:328] unrecognized feature gate: SignatureStores Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324621 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324625 5200 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324630 5200 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324635 5200 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324639 5200 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324644 5200 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324649 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324655 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324660 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324665 5200 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324669 5200 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324675 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324679 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324700 5200 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324705 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324710 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324714 5200 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324718 5200 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324723 5200 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324727 5200 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324731 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324736 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324740 5200 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324745 5200 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324750 5200 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324754 5200 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324758 5200 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324766 5200 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324773 5200 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324777 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324781 5200 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324786 5200 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324790 5200 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324795 5200 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324799 5200 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324803 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324807 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324811 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324815 5200 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324820 5200 feature_gate.go:328] unrecognized feature gate: Example Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324825 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324829 5200 feature_gate.go:328] unrecognized feature gate: NewOLM Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324834 5200 feature_gate.go:328] unrecognized feature gate: DualReplica Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324838 5200 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324842 5200 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324846 5200 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324850 5200 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324854 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324858 5200 feature_gate.go:328] unrecognized feature 
gate: NoRegistryClusterOperations Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324863 5200 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324867 5200 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324871 5200 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324875 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324886 5200 feature_gate.go:328] unrecognized feature gate: PinnedImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324891 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324895 5200 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324899 5200 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324903 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324907 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324913 5200 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324918 5200 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324923 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324927 5200 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324932 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324937 5200 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324941 5200 feature_gate.go:328] unrecognized feature gate: OVNObservability Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324945 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324952 5200 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324956 5200 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324964 5200 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.324970 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325491 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325508 5200 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325514 5200 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325518 5200 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325523 5200 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325527 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325531 5200 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325535 5200 feature_gate.go:328] unrecognized feature gate: Example2 Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325540 5200 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325544 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325549 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325554 5200 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325558 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325563 5200 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325567 5200 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325582 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325587 5200 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325591 5200 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325595 5200 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325599 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325605 5200 feature_gate.go:328] unrecognized feature gate: SignatureStores Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325609 5200 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325614 5200 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325618 5200 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325622 
5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325627 5200 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325631 5200 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325635 5200 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325641 5200 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325645 5200 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325649 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325654 5200 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325658 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325663 5200 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325674 5200 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325679 5200 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325700 5200 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325705 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325709 5200 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325714 5200 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325718 5200 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325723 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325727 5200 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325731 5200 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325736 5200 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325740 5200 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325745 5200 feature_gate.go:328] unrecognized feature gate: OVNObservability Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325749 5200 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325753 5200 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325758 5200 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325763 5200 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325767 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325772 5200 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325776 5200 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325780 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325784 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325788 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325793 5200 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325797 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325801 5200 feature_gate.go:328] unrecognized feature gate: DualReplica Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325806 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325810 5200 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325815 5200 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325819 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325823 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325827 5200 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325831 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325845 5200 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325849 5200 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325855 5200 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325862 5200 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325866 5200 feature_gate.go:328] unrecognized feature gate: PinnedImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325873 5200 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325878 5200 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325882 5200 feature_gate.go:328] unrecognized feature gate: NewOLM Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325886 5200 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325891 5200 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325895 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325900 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325904 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325908 5200 feature_gate.go:328] unrecognized feature gate: Example Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325912 5200 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325917 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325921 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325925 5200 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.325929 5200 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.326988 5200 flags.go:64] FLAG: --address="0.0.0.0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327003 5200 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327020 5200 flags.go:64] FLAG: --anonymous-auth="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327028 5200 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327035 5200 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327040 5200 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327047 5200 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327055 5200 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327061 5200 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327066 5200 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327071 5200 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327076 5200 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327081 5200 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327086 5200 flags.go:64] FLAG: --cgroup-root="" Nov 26 13:30:32 crc 
kubenswrapper[5200]: I1126 13:30:32.327099 5200 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327104 5200 flags.go:64] FLAG: --client-ca-file="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327110 5200 flags.go:64] FLAG: --cloud-config="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327114 5200 flags.go:64] FLAG: --cloud-provider="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327119 5200 flags.go:64] FLAG: --cluster-dns="[]" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327128 5200 flags.go:64] FLAG: --cluster-domain="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327135 5200 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327140 5200 flags.go:64] FLAG: --config-dir="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327144 5200 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327150 5200 flags.go:64] FLAG: --container-log-max-files="5" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327179 5200 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327185 5200 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327189 5200 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327194 5200 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327199 5200 flags.go:64] FLAG: --contention-profiling="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327204 5200 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327209 5200 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327213 5200 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327218 5200 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327225 5200 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327231 5200 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327236 5200 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327242 5200 flags.go:64] FLAG: --enable-load-reader="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327246 5200 flags.go:64] FLAG: --enable-server="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327257 5200 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327269 5200 flags.go:64] FLAG: --event-burst="100" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327274 5200 flags.go:64] FLAG: --event-qps="50" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327279 5200 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327284 5200 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327292 5200 flags.go:64] FLAG: --eviction-hard="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 
13:30:32.327298 5200 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327303 5200 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327308 5200 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327320 5200 flags.go:64] FLAG: --eviction-soft="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327325 5200 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327330 5200 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327335 5200 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327340 5200 flags.go:64] FLAG: --experimental-mounter-path="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327345 5200 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327349 5200 flags.go:64] FLAG: --fail-swap-on="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327354 5200 flags.go:64] FLAG: --feature-gates="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327360 5200 flags.go:64] FLAG: --file-check-frequency="20s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327365 5200 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327371 5200 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327376 5200 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327380 5200 flags.go:64] FLAG: --healthz-port="10248" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327386 5200 flags.go:64] FLAG: --help="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327390 5200 flags.go:64] FLAG: --hostname-override="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327395 5200 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327400 5200 flags.go:64] FLAG: --http-check-frequency="20s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327405 5200 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327409 5200 flags.go:64] FLAG: --image-credential-provider-config="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327414 5200 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327419 5200 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327424 5200 flags.go:64] FLAG: --image-service-endpoint="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327428 5200 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327436 5200 flags.go:64] FLAG: --kube-api-burst="100" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327441 5200 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327446 5200 flags.go:64] FLAG: --kube-api-qps="50" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327451 5200 flags.go:64] FLAG: --kube-reserved="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327456 
5200 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327462 5200 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327467 5200 flags.go:64] FLAG: --kubelet-cgroups="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327472 5200 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327477 5200 flags.go:64] FLAG: --lock-file="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327482 5200 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327494 5200 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327500 5200 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327508 5200 flags.go:64] FLAG: --log-json-split-stream="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327513 5200 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327517 5200 flags.go:64] FLAG: --log-text-split-stream="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327522 5200 flags.go:64] FLAG: --logging-format="text" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327527 5200 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327533 5200 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327537 5200 flags.go:64] FLAG: --manifest-url="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327542 5200 flags.go:64] FLAG: --manifest-url-header="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327549 5200 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327554 5200 flags.go:64] FLAG: --max-open-files="1000000" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327561 5200 flags.go:64] FLAG: --max-pods="110" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327566 5200 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327570 5200 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327575 5200 flags.go:64] FLAG: --memory-manager-policy="None" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327580 5200 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327585 5200 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327590 5200 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327595 5200 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhel" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327607 5200 flags.go:64] FLAG: --node-status-max-images="50" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327612 5200 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327619 5200 flags.go:64] FLAG: --oom-score-adj="-999" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327625 5200 flags.go:64] FLAG: --pod-cidr="" Nov 26 
13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327629 5200 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc2b30e70040205c2536d01ae5c850be1ed2d775cf13249e50328e5085777977" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327638 5200 flags.go:64] FLAG: --pod-manifest-path="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327642 5200 flags.go:64] FLAG: --pod-max-pids="-1" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327648 5200 flags.go:64] FLAG: --pods-per-core="0" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327655 5200 flags.go:64] FLAG: --port="10250" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327660 5200 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327665 5200 flags.go:64] FLAG: --provider-id="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327670 5200 flags.go:64] FLAG: --qos-reserved="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327674 5200 flags.go:64] FLAG: --read-only-port="10255" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327716 5200 flags.go:64] FLAG: --register-node="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327724 5200 flags.go:64] FLAG: --register-schedulable="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327728 5200 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327738 5200 flags.go:64] FLAG: --registry-burst="10" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327743 5200 flags.go:64] FLAG: --registry-qps="5" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327748 5200 flags.go:64] FLAG: --reserved-cpus="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327752 5200 flags.go:64] FLAG: --reserved-memory="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327758 5200 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327762 5200 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327768 5200 flags.go:64] FLAG: --rotate-certificates="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327772 5200 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327777 5200 flags.go:64] FLAG: --runonce="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327782 5200 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327787 5200 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327792 5200 flags.go:64] FLAG: --seccomp-default="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327797 5200 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327802 5200 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327807 5200 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327811 5200 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327817 5200 flags.go:64] FLAG: --storage-driver-password="root" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327822 5200 flags.go:64] FLAG: 
--storage-driver-secure="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327829 5200 flags.go:64] FLAG: --storage-driver-table="stats" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327834 5200 flags.go:64] FLAG: --storage-driver-user="root" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327839 5200 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327844 5200 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327849 5200 flags.go:64] FLAG: --system-cgroups="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327854 5200 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327865 5200 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327869 5200 flags.go:64] FLAG: --tls-cert-file="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327874 5200 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327892 5200 flags.go:64] FLAG: --tls-min-version="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327896 5200 flags.go:64] FLAG: --tls-private-key-file="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327901 5200 flags.go:64] FLAG: --topology-manager-policy="none" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327912 5200 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327917 5200 flags.go:64] FLAG: --topology-manager-scope="container" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327922 5200 flags.go:64] FLAG: --v="2" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327929 5200 flags.go:64] FLAG: --version="false" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327936 5200 flags.go:64] FLAG: --vmodule="" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327942 5200 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.327948 5200 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328101 5200 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328109 5200 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328114 5200 feature_gate.go:328] unrecognized feature gate: Example Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328118 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328122 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328151 5200 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328157 5200 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
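
The deprecation warnings at the top of this log all point to the same migration: the values still being passed as command-line flags (their effective values are visible in the FLAG dump above) are meant to move into the file named by --config, here /etc/kubernetes/kubelet.conf. As a rough sketch only, assuming the standard kubelet.config.k8s.io/v1beta1 field names and reusing the values from the FLAG lines above (this is not the node's actual config file), the deprecated flags would map to KubeletConfiguration fields roughly like this:

```yaml
# Illustrative sketch only: how the deprecated flags logged above could be
# expressed in the file passed via --config=/etc/kubernetes/kubelet.conf.
# Values are copied from the FLAG dump in this log, not from the real file.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: /var/run/crio/crio.sock            # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec # replaces --volume-plugin-dir
registerWithTaints:                                          # replaces --register-with-taints
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
systemReserved:                                              # replaces --system-reserved
  cpu: 200m
  ephemeral-storage: 350Mi
  memory: 350Mi
```
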
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328163 5200 feature_gate.go:328] unrecognized feature gate: SignatureStores Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328168 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328173 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328178 5200 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328182 5200 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328187 5200 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328195 5200 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328200 5200 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328204 5200 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328208 5200 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328212 5200 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328217 5200 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328224 5200 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328228 5200 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328232 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328237 5200 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328241 5200 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328246 5200 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328250 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328263 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328268 5200 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328272 5200 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328276 5200 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328280 5200 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328286 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 13:30:32 crc 
kubenswrapper[5200]: W1126 13:30:32.328290 5200 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328295 5200 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328299 5200 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328304 5200 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328309 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328316 5200 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328322 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328326 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328331 5200 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328336 5200 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328341 5200 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328345 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328350 5200 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328355 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328360 5200 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328364 5200 feature_gate.go:328] unrecognized feature gate: Example2 Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328369 5200 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328373 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328377 5200 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328384 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328389 5200 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328394 5200 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328398 5200 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328403 5200 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328407 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Nov 26 13:30:32 crc 
kubenswrapper[5200]: W1126 13:30:32.328411 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328416 5200 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328430 5200 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328435 5200 feature_gate.go:328] unrecognized feature gate: PinnedImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328439 5200 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328444 5200 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328448 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328452 5200 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328456 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328461 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328465 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328470 5200 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328474 5200 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328478 5200 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328482 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328487 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328491 5200 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328496 5200 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328500 5200 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328504 5200 feature_gate.go:328] unrecognized feature gate: DualReplica Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328508 5200 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328513 5200 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328517 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328521 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328526 5200 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328530 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDC 
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328536 5200 feature_gate.go:328] unrecognized feature gate: NewOLM
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328541 5200 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.328546 5200 feature_gate.go:328] unrecognized feature gate: OVNObservability
Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.328559 5200 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.352271 5200 server.go:530] "Kubelet version" kubeletVersion="v1.33.5"
Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.352321 5200 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352438 5200 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352450 5200 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352458 5200 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352467 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352475 5200 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352483 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352490 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352497 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352504 5200 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352512 5200 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352519 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352527 5200 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352535 5200 feature_gate.go:328] unrecognized feature gate: NewOLM
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352542 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352550 5200 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352557 5200 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352564 5200 feature_gate.go:328] unrecognized feature gate:
VSphereMultiNetworks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352574 5200 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352584 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352592 5200 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352600 5200 feature_gate.go:328] unrecognized feature gate: DualReplica Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352607 5200 feature_gate.go:328] unrecognized feature gate: PinnedImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352614 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352621 5200 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352629 5200 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352636 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352643 5200 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352650 5200 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352657 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352665 5200 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352673 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352681 5200 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352720 5200 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352728 5200 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352735 5200 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352743 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352750 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352757 5200 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352764 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352771 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352780 5200 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352788 5200 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Nov 26 13:30:32 crc 
kubenswrapper[5200]: W1126 13:30:32.352795 5200 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352820 5200 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352827 5200 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352835 5200 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352842 5200 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352852 5200 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352863 5200 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352871 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352878 5200 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352886 5200 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352893 5200 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352900 5200 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352908 5200 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352915 5200 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352923 5200 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352931 5200 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352938 5200 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352945 5200 feature_gate.go:328] unrecognized feature gate: OVNObservability Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352952 5200 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352960 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352967 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352974 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352981 5200 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352988 5200 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.352996 5200 feature_gate.go:328] unrecognized feature gate: Example Nov 26 13:30:32 crc 
kubenswrapper[5200]: W1126 13:30:32.353003 5200 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353010 5200 feature_gate.go:328] unrecognized feature gate: SignatureStores Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353017 5200 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353024 5200 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353033 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353040 5200 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353047 5200 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353054 5200 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353062 5200 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353082 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353090 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353097 5200 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353104 5200 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353111 5200 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353118 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353126 5200 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353133 5200 feature_gate.go:328] unrecognized feature gate: Example2 Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353140 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353147 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.353159 5200 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353368 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353381 5200 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 13:30:32 crc 
kubenswrapper[5200]: W1126 13:30:32.353389 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353396 5200 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353404 5200 feature_gate.go:328] unrecognized feature gate: DualReplica Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353412 5200 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353419 5200 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353428 5200 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353435 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353444 5200 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353451 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353458 5200 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353465 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353472 5200 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353479 5200 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353487 5200 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353494 5200 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353501 5200 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353509 5200 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353516 5200 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353523 5200 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353530 5200 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353551 5200 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353558 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353565 5200 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353572 5200 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353580 5200 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353587 5200 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353594 5200 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353601 5200 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353608 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353615 5200 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353622 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353629 5200 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353636 5200 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353643 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353651 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353658 5200 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353665 5200 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353673 5200 feature_gate.go:328] unrecognized feature gate: NewOLM Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353680 5200 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353712 5200 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353721 5200 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353728 5200 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353735 5200 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353742 5200 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353752 5200 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
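
The `Setting deprecated feature gate KMSv1=true` and `Setting GA feature gate ServiceAccountTokenNodeBinding=true` warnings in this stretch show a second case: the gate is recognized, the override is applied, but it is flagged because a Deprecated or GA gate will stop being configurable in a future release. A hedged sketch of the same idea with a lifecycle stage per gate; the registry contents and stages here are assumptions for illustration.

```go
package main

import "log"

type stage string

const (
	alpha      stage = "ALPHA"
	beta       stage = "BETA"
	ga         stage = "GA"
	deprecated stage = "DEPRECATED"
)

type spec struct {
	enabled bool
	stage   stage
}

// registry is illustrative; the real kubelet keeps its gate specs elsewhere.
var registry = map[string]spec{
	"KMSv1":                          {enabled: false, stage: deprecated},
	"ServiceAccountTokenNodeBinding": {enabled: true, stage: ga},
	"NodeSwap":                       {enabled: false, stage: beta},
}

// set applies one explicit override and emits the same kind of warning the
// journal shows when a deprecated or GA gate is set by hand.
func set(name string, enabled bool) {
	s, ok := registry[name]
	if !ok {
		log.Printf("W unrecognized feature gate: %s", name)
		return
	}
	switch s.stage {
	case deprecated:
		log.Printf("W Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, enabled)
	case ga:
		log.Printf("W Setting GA feature gate %s=%t. It will be removed in a future release.", name, enabled)
	}
	s.enabled = enabled
	registry[name] = s
}

func main() {
	set("KMSv1", true)
	set("ServiceAccountTokenNodeBinding", true)
	set("NodeSwap", false)
}
```
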
Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353762 5200 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353769 5200 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353777 5200 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353784 5200 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353792 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353799 5200 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353806 5200 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353813 5200 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353835 5200 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353843 5200 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353850 5200 feature_gate.go:328] unrecognized feature gate: SignatureStores Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353857 5200 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353864 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353871 5200 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353878 5200 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353886 5200 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353893 5200 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353900 5200 feature_gate.go:328] unrecognized feature gate: PinnedImages Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353907 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353914 5200 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353922 5200 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353929 5200 feature_gate.go:328] unrecognized feature gate: Example Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353936 5200 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353946 5200 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353955 5200 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353972 5200 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353988 5200 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.353997 5200 feature_gate.go:328] unrecognized feature gate: Example2 Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354006 5200 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354015 5200 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354024 5200 feature_gate.go:328] unrecognized feature gate: OVNObservability Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354038 5200 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354049 5200 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354059 5200 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354069 5200 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354078 5200 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354087 5200 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354095 5200 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 13:30:32 crc kubenswrapper[5200]: W1126 13:30:32.354105 5200 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.354121 5200 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.355014 5200 server.go:962] "Client rotation is on, will bootstrap in background" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.385831 5200 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.385956 5200 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.388325 5200 server.go:1019] "Starting client certificate rotation" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.388547 5200 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.389627 5200 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2025-12-03 08:27:53 +0000 UTC" deadline="2025-11-24 
21:40:11.458131381 +0000 UTC" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.389717 5200 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.487901 5200 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.509673 5200 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.514303 5200 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.549280 5200 log.go:25] "Validated CRI v1 runtime API" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.687834 5200 log.go:25] "Validated CRI v1 image API" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.690378 5200 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.698344 5200 fs.go:135] Filesystem UUIDs: map[19e76f87-96b8-4794-9744-0b33dca22d5b:/dev/vda3 2025-11-26-13-23-27-00:/dev/sr0 5eb7c122-420e-4494-80ec-41664070d7b6:/dev/vda4 7B77-95E7:/dev/vda2] Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.698396 5200 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:44 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.722099 5200 manager.go:217] Machine: {Timestamp:2025-11-26 13:30:32.717538086 +0000 UTC m=+1.396739974 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33649934336 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:80bc4fba336e4ca1bc9d28a8be52a356 SystemUUID:ff7ef8e9-7d9f-4686-9878-860adef1fa54 BootID:bf55aba6-4062-43be-a58f-da4b1f8292ae Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6729990144 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3364990976 Type:vfs Inodes:821531 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:44 Capacity:1073741824 Type:vfs Inodes:4107658 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16824967168 Type:vfs Inodes:4107658 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6545408 Type:vfs 
Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16824967168 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9a:4b:42 Speed:0 Mtu:1500} {Name:br-int MacAddress:b2:a9:9f:57:07:84 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9a:4b:42 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c3:21:e8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8d:4e:3c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8a:8d:7c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c6:5b:91 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:fa:56:87 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:42:5e:f4:f3:c7:95 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:96:be:6d:9f:26:5f Speed:0 Mtu:1500} {Name:tap0 MacAddress:5a:94:ef:e4:0c:ee Speed:10 Mtu:1500}] Topology:[{Id:0 Memory:33649934336 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] 
Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.722521 5200 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.722812 5200 manager.go:233] Version: {KernelVersion:5.14.0-570.57.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20251021-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725012 5200 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725057 5200 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725324 5200 topology_manager.go:138] "Creating topology manager with none policy" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725340 5200 container_manager_linux.go:306] "Creating device plugin manager" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725378 5200 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725407 5200 server.go:72] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725731 5200 state_mem.go:36] "Initialized new in-memory state store" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.725982 5200 server.go:1267] "Using root directory" path="/var/lib/kubelet" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.738323 5200 kubelet.go:491] "Attempting to sync node with API server" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.738345 5200 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.738361 5200 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.738374 5200 kubelet.go:397] "Adding apiserver pod source" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.738386 5200 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.741822 5200 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.741839 5200 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.743433 5200 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.743455 5200 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.746962 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.746990 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.747089 5200 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.5-3.rhaos4.20.gitd0ea985.el9" apiVersion="v1" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.747352 5200 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-server-current.pem" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.749757 5200 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754329 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754376 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754390 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754404 5200 
plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754418 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754431 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754444 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754457 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754474 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754500 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.754519 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.762336 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.768600 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.768648 5200 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.786590 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.832203 5200 watchdog_linux.go:99] "Systemd watchdog is not enabled" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.832288 5200 server.go:1295] "Started kubelet" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.832611 5200 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.832737 5200 server_v1.go:47] "podresources" method="list" useActivePods=true Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.832884 5200 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.833591 5200 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 26 13:30:32 crc systemd[1]: Started Kubernetes Kubelet. 
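
Several of the errors above (the CSR post, the Service/Node list-watch, the CSINode lookup) all fail with `dial tcp 38.102.83.224:6443: connect: connection refused` because api-int.crc.testing:6443 is not up yet, and the kubelet keeps starting while the callers retry on their own schedules. A self-contained Go sketch of that retry-with-doubling-backoff pattern; the endpoint, intervals, and cap are illustrative, not the kubelet's actual controllers.

```go
package main

import (
	"context"
	"fmt"
	"net/http"
	"time"
)

// getWithBackoff keeps retrying a GET until it succeeds or the context is
// cancelled, doubling the wait after each failure up to a cap.
func getWithBackoff(ctx context.Context, url string) (*http.Response, error) {
	interval := 200 * time.Millisecond
	for {
		resp, err := http.Get(url)
		if err == nil {
			return resp, nil
		}
		fmt.Printf("request failed, will retry: %v (next interval %s)\n", err, interval)
		select {
		case <-ctx.Done():
			return nil, ctx.Err()
		case <-time.After(interval):
		}
		if interval < 10*time.Second {
			interval *= 2
		}
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	// api-int.crc.testing:6443 is the endpoint from the journal; while the
	// apiserver is down this loops until the context expires.
	resp, err := getWithBackoff(ctx, "https://api-int.crc.testing:6443/healthz")
	if err != nil {
		fmt.Println("gave up:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```
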
Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.834446 5200 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.834525 5200 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2025-12-03 08:28:17 +0000 UTC" deadline="2025-11-27 23:33:27.915609711 +0000 UTC" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.834593 5200 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="34h2m55.081020654s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.835732 5200 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.836391 5200 server.go:317] "Adding debug handlers to kubelet server" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.836608 5200 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.836625 5200 volume_manager.go:295] "The desired_state_of_world populator starts" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.836642 5200 volume_manager.go:297] "Starting Kubelet Volume Manager" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.836949 5200 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.837115 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.839225 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.843595 5200 factory.go:55] Registering systemd factory Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.843664 5200 factory.go:223] Registration of the systemd container factory successfully Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.844129 5200 factory.go:153] Registering CRI-O factory Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.844151 5200 factory.go:223] Registration of the crio container factory successfully Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.844240 5200 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.844296 5200 factory.go:103] Registering Raw factory Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.844312 5200 manager.go:1196] Started watching for new ooms in manager Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.844935 5200 manager.go:319] Starting recovery of all containers Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.845462 5200 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b91a8b58ea3f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 13:30:32.832238577 +0000 UTC m=+1.511440435,LastTimestamp:2025-11-26 13:30:32.832238577 +0000 UTC m=+1.511440435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.882960 5200 manager.go:324] Recovery completed Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.897094 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.898805 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.898873 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.898888 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.900220 5200 cpu_manager.go:222] "Starting CPU manager" policy="none" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.900238 5200 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.900289 5200 state_mem.go:36] "Initialized new in-memory state store" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.937590 5200 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.965472 5200 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.967742 5200 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.967887 5200 status_manager.go:230] "Starting to sync pod status with apiserver" Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.967986 5200 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Nov 26 13:30:32 crc kubenswrapper[5200]: I1126 13:30:32.968062 5200 kubelet.go:2451] "Starting kubelet main sync loop" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.968267 5200 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 26 13:30:32 crc kubenswrapper[5200]: E1126 13:30:32.969056 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.037922 5200 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.040825 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.055205 5200 policy_none.go:49] "None policy: Start" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.055251 5200 memory_manager.go:186] "Starting memorymanager" policy="None" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.055276 5200 state_mem.go:35] "Initializing new in-memory state store" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.069935 5200 kubelet.go:2475] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.138478 5200 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211195 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211251 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211265 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211274 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211283 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" 
seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211291 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211299 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211308 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211320 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211328 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211336 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211344 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211353 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211366 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211376 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211384 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc 
kubenswrapper[5200]: I1126 13:30:33.211398 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211406 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211415 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211423 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211434 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211446 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211459 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211467 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211476 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211484 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211493 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" seLinuxMountContext="" Nov 26 13:30:33 crc 
kubenswrapper[5200]: I1126 13:30:33.211519 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211531 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211550 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211559 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211569 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211587 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211596 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211605 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211614 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211622 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211637 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 
13:30:33.211645 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211653 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211662 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211670 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211678 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211701 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211753 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20c5c5b4bed930554494851fe3cb2b2a" volumeName="kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211774 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211782 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211791 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211809 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211817 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211825 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211833 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211840 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211853 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211861 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211869 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211885 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211892 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211901 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211918 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b87002-b798-480a-8e17-83053d698239" volumeName="kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211932 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211946 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211974 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211983 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.211991 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212000 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212008 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212016 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212024 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212036 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212045 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212053 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" 
volumeName="kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212062 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212072 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212081 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212091 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212100 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212118 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212127 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212137 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212151 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212167 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212197 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" 
volumeName="kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212207 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212217 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212241 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212251 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212266 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f863fff9-286a-45fa-b8f0-8a86994b8440" volumeName="kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212275 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212285 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212295 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212308 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212331 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212345 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" 
volumeName="kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212354 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212362 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212370 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212378 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212394 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212408 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212416 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212428 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212436 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212449 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212457 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" 
volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212468 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212476 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212484 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212492 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212503 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212517 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212525 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212548 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212557 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212571 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212952 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" 
volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.212966 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213007 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213016 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213051 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213061 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213181 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213191 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213233 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af41de71-79cf-4590-bbe9-9e8b848862cb" volumeName="kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213246 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213274 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213286 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b638b8f4bb0070e40528db779baf6a2" 
volumeName="kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213295 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213304 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213316 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213326 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213376 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213386 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213423 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213433 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213467 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213524 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213535 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" seLinuxMountContext="" 
Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213584 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213629 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.213643 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214031 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214158 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214221 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214261 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214289 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214337 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214374 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.214431 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" seLinuxMountContext="" Nov 26 13:30:33 crc 
kubenswrapper[5200]: I1126 13:30:33.220313 5200 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220380 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220422 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220453 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220489 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220527 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220556 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220580 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220606 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220626 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0effdbcf-dd7d-404d-9d48-77536d665a5d" volumeName="kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220709 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" 
seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220730 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220754 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220780 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220813 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220876 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220920 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220944 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220963 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.220986 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221004 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221039 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" seLinuxMountContext="" Nov 26 13:30:33 crc 
kubenswrapper[5200]: I1126 13:30:33.221064 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221082 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221112 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221131 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221174 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221193 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221213 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221237 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e093be35-bb62-4843-b2e8-094545761610" volumeName="kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221255 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221277 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221303 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 
13:30:33.221333 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221364 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221400 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221424 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221447 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221495 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221515 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221539 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221562 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221627 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221653 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221672 5200 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221763 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221782 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221801 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221865 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221885 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221909 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221929 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221952 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.221972 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222035 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222064 5200 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222088 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222111 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222138 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222166 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222213 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222233 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222255 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222274 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222307 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222331 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222367 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222411 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222437 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222460 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222480 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222516 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222559 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222594 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222627 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222653 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222858 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222917 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222936 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.222986 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223024 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223047 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223085 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223104 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223177 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223198 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223233 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223251 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223275 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223297 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223321 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223388 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223409 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223431 5200 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" seLinuxMountContext="" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223448 5200 reconstruct.go:97] "Volume reconstruction finished" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.223461 5200 reconciler.go:26] "Reconciler: start to sync state" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.239113 5200 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.270031 5200 kubelet.go:2475] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.295741 5200 manager.go:341] "Starting Device Plugin manager" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.295825 5200 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.298825 5200 server.go:85] "Starting device plugin registration server" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.299422 5200 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.299439 5200 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.299660 5200 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.299957 5200 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.299989 5200 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 26 13:30:33 crc 
kubenswrapper[5200]: E1126 13:30:33.308088 5200 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.308129 5200 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.400083 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.401240 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.401278 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.401297 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.401335 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.401994 5200 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.442253 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.602138 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.612131 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.612196 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.612210 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.612240 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.612738 5200 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.670837 5200 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.671029 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.672193 5200 kubelet_node_status.go:736] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.672258 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.672277 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.673432 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.673676 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.673811 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.674244 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.674283 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.674300 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.674545 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.674587 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.674600 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675024 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675292 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675368 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675506 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675529 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675539 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.675990 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676015 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676025 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676126 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676413 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676469 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676887 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.676999 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.677028 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.677765 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.677791 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.677801 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.678633 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.678654 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.678664 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.679172 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.679196 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.679206 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.679508 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.679553 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.679571 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.680974 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.681040 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.681678 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.681726 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.681742 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.724741 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.739427 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.764974 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.787962 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.789779 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.801460 5200 kubelet.go:3336] "No 
need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831599 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831659 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831827 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831890 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831915 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831938 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.831961 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832139 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832173 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832195 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832215 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832237 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832259 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832282 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832322 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832344 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832365 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832384 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832403 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832421 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832441 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832462 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.832484 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.833204 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.833330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.833337 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.833352 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.833499 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.833720 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.836345 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.933900 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.933959 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934033 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934060 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934056 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934118 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934136 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934138 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934156 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934214 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934226 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934260 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934290 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934321 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934324 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934341 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934345 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: 
\"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934364 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934387 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934395 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934394 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934412 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934425 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934469 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934454 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934487 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934489 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934526 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934555 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934581 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934632 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: I1126 13:30:33.934712 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:33 crc kubenswrapper[5200]: E1126 13:30:33.957045 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.013233 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.014199 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.014254 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.014265 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.014296 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.014816 5200 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.026119 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.041731 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.047251 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.066407 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.097604 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.102708 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:34 crc kubenswrapper[5200]: W1126 13:30:34.131937 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e08c320b1e9e2405e6e0107bdf7eeb4.slice/crio-1e59f501873a11a7358044a0ae0938a279c9e5a33b6da7767c9c83fcfc8bbaab WatchSource:0}: Error finding container 1e59f501873a11a7358044a0ae0938a279c9e5a33b6da7767c9c83fcfc8bbaab: Status 404 returned error can't find the container with id 1e59f501873a11a7358044a0ae0938a279c9e5a33b6da7767c9c83fcfc8bbaab Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.137934 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 13:30:34 crc kubenswrapper[5200]: W1126 13:30:34.138476 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c5c5b4bed930554494851fe3cb2b2a.slice/crio-28301b0863437c05447fb280ae8a4b6670ef88aeae2425d3cab29e8850a2aa33 WatchSource:0}: Error finding container 28301b0863437c05447fb280ae8a4b6670ef88aeae2425d3cab29e8850a2aa33: Status 404 returned error can't find the container with id 28301b0863437c05447fb280ae8a4b6670ef88aeae2425d3cab29e8850a2aa33 Nov 26 13:30:34 crc kubenswrapper[5200]: W1126 13:30:34.163267 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a14caf222afb62aaabdc47808b6f944.slice/crio-806576044a5db9421cbba9df64ad2b400176ea3c1122e74c716770dc3be12cc2 WatchSource:0}: Error finding container 806576044a5db9421cbba9df64ad2b400176ea3c1122e74c716770dc3be12cc2: Status 404 returned error can't find the container with id 806576044a5db9421cbba9df64ad2b400176ea3c1122e74c716770dc3be12cc2 Nov 26 13:30:34 crc kubenswrapper[5200]: W1126 13:30:34.167595 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b638b8f4bb0070e40528db779baf6a2.slice/crio-7726b60a814f2f896c07a1049542db50b050c675de86516f43f17aced195f40f WatchSource:0}: Error finding container 7726b60a814f2f896c07a1049542db50b050c675de86516f43f17aced195f40f: Status 404 returned error can't find the container with id 
7726b60a814f2f896c07a1049542db50b050c675de86516f43f17aced195f40f Nov 26 13:30:34 crc kubenswrapper[5200]: W1126 13:30:34.168368 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0bc7fcb0822a2c13eb2d22cd8c0641.slice/crio-300076dcd335732a8a8c06f6cd669b656b729cf9e36a68ebbe332311e37da874 WatchSource:0}: Error finding container 300076dcd335732a8a8c06f6cd669b656b729cf9e36a68ebbe332311e37da874: Status 404 returned error can't find the container with id 300076dcd335732a8a8c06f6cd669b656b729cf9e36a68ebbe332311e37da874 Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.243720 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.276930 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.398470 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.698949 5200 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.700769 5200 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.788156 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.815377 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.816981 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.817048 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.817070 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.817111 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:34 crc kubenswrapper[5200]: E1126 13:30:34.817945 5200 
kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.975082 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"7726b60a814f2f896c07a1049542db50b050c675de86516f43f17aced195f40f"} Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.977185 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"300076dcd335732a8a8c06f6cd669b656b729cf9e36a68ebbe332311e37da874"} Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.978941 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"806576044a5db9421cbba9df64ad2b400176ea3c1122e74c716770dc3be12cc2"} Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.980371 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"28301b0863437c05447fb280ae8a4b6670ef88aeae2425d3cab29e8850a2aa33"} Nov 26 13:30:34 crc kubenswrapper[5200]: I1126 13:30:34.981751 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"1e59f501873a11a7358044a0ae0938a279c9e5a33b6da7767c9c83fcfc8bbaab"} Nov 26 13:30:35 crc kubenswrapper[5200]: I1126 13:30:35.789203 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:35 crc kubenswrapper[5200]: E1126 13:30:35.844319 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Nov 26 13:30:35 crc kubenswrapper[5200]: I1126 13:30:35.985832 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273"} Nov 26 13:30:35 crc kubenswrapper[5200]: I1126 13:30:35.987398 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd"} Nov 26 13:30:35 crc kubenswrapper[5200]: I1126 13:30:35.988861 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678"} Nov 26 13:30:35 crc kubenswrapper[5200]: I1126 13:30:35.990275 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f"} Nov 26 13:30:35 crc kubenswrapper[5200]: I1126 13:30:35.991669 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c"} Nov 26 13:30:36 crc kubenswrapper[5200]: E1126 13:30:36.184374 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Nov 26 13:30:36 crc kubenswrapper[5200]: E1126 13:30:36.341556 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.418901 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.420218 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.420259 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.420271 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.420299 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:36 crc kubenswrapper[5200]: E1126 13:30:36.420828 5200 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:36 crc kubenswrapper[5200]: E1126 13:30:36.530810 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.788870 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:36 crc kubenswrapper[5200]: E1126 13:30:36.804728 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Nov 26 13:30:36 crc 
kubenswrapper[5200]: E1126 13:30:36.857833 5200 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b91a8b58ea3f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 13:30:32.832238577 +0000 UTC m=+1.511440435,LastTimestamp:2025-11-26 13:30:32.832238577 +0000 UTC m=+1.511440435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.998328 5200 generic.go:358] "Generic (PLEG): container finished" podID="0b638b8f4bb0070e40528db779baf6a2" containerID="d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273" exitCode=0 Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.998472 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerDied","Data":"d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273"} Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.998552 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.999860 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.999890 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:36 crc kubenswrapper[5200]: I1126 13:30:36.999901 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:37 crc kubenswrapper[5200]: E1126 13:30:37.000120 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.001702 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434"} Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.004094 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678" exitCode=0 Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.004196 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678"} Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.004284 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.004951 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 
13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.004982 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.004993 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:37 crc kubenswrapper[5200]: E1126 13:30:37.005185 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.006657 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.007514 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.007545 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.007558 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.008136 5200 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f" exitCode=0 Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.008259 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f"} Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.008370 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.011749 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.011821 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.011857 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:37 crc kubenswrapper[5200]: E1126 13:30:37.012410 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.014575 5200 generic.go:358] "Generic (PLEG): container finished" podID="4e08c320b1e9e2405e6e0107bdf7eeb4" containerID="dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c" exitCode=0 Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.014654 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerDied","Data":"dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c"} Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.014910 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.015711 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.015736 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.015747 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:37 crc kubenswrapper[5200]: E1126 13:30:37.015940 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:37 crc kubenswrapper[5200]: I1126 13:30:37.787440 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.018383 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1"} Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.020829 5200 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0" exitCode=0 Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.020888 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0"} Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.021017 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.021869 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.021925 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.021939 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:38 crc kubenswrapper[5200]: E1126 13:30:38.022283 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.069929 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990"} Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.069976 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.071360 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.071410 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.071422 5200 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:38 crc kubenswrapper[5200]: E1126 13:30:38.071650 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.072952 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64"} Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.787427 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:38 crc kubenswrapper[5200]: I1126 13:30:38.863202 5200 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Nov 26 13:30:38 crc kubenswrapper[5200]: E1126 13:30:38.864279 5200 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Nov 26 13:30:39 crc kubenswrapper[5200]: E1126 13:30:39.046086 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="6.4s" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.077834 5200 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71" exitCode=0 Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.077934 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71"} Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.078148 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.079077 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.079132 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.079158 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:39 crc kubenswrapper[5200]: E1126 13:30:39.079515 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.082578 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9"} Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.082602 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.082613 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b"} Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.082748 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.083736 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.083800 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.083824 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.083735 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.083925 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.083953 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:39 crc kubenswrapper[5200]: E1126 13:30:39.084225 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:39 crc kubenswrapper[5200]: E1126 13:30:39.084211 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.621977 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.623140 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.623183 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.623194 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.623222 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:39 crc kubenswrapper[5200]: E1126 13:30:39.623775 5200 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:39 crc kubenswrapper[5200]: I1126 13:30:39.788238 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:40 crc kubenswrapper[5200]: I1126 13:30:40.085027 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:40 crc kubenswrapper[5200]: I1126 13:30:40.085164 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:30:40 crc kubenswrapper[5200]: I1126 13:30:40.085598 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:40 crc kubenswrapper[5200]: I1126 13:30:40.085628 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:40 crc kubenswrapper[5200]: I1126 13:30:40.085640 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:40 crc kubenswrapper[5200]: E1126 13:30:40.085922 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:40 crc kubenswrapper[5200]: E1126 13:30:40.606992 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Nov 26 13:30:40 crc kubenswrapper[5200]: I1126 13:30:40.787468 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:41 crc kubenswrapper[5200]: I1126 13:30:41.087609 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:41 crc kubenswrapper[5200]: I1126 13:30:41.088654 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:41 crc kubenswrapper[5200]: I1126 13:30:41.088755 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:41 crc kubenswrapper[5200]: I1126 13:30:41.088777 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:41 crc kubenswrapper[5200]: E1126 13:30:41.089282 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:41 crc kubenswrapper[5200]: E1126 13:30:41.419248 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Nov 26 13:30:41 crc kubenswrapper[5200]: E1126 13:30:41.693788 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Nov 26 13:30:41 crc kubenswrapper[5200]: I1126 13:30:41.788066 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:41 crc kubenswrapper[5200]: E1126 13:30:41.995884 5200 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Nov 26 13:30:42 crc kubenswrapper[5200]: I1126 13:30:42.091167 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b"} Nov 26 13:30:42 crc kubenswrapper[5200]: I1126 13:30:42.787367 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:43 crc kubenswrapper[5200]: I1126 13:30:43.095776 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77"} Nov 26 13:30:43 crc kubenswrapper[5200]: E1126 13:30:43.308984 5200 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 13:30:43 crc kubenswrapper[5200]: I1126 13:30:43.788132 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:44 crc kubenswrapper[5200]: I1126 13:30:44.101922 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722"} Nov 26 13:30:44 crc kubenswrapper[5200]: I1126 13:30:44.787567 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:45 crc kubenswrapper[5200]: E1126 13:30:45.446718 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="7s" Nov 26 13:30:45 crc kubenswrapper[5200]: I1126 13:30:45.787420 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 
13:30:46.024173 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.025492 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.025555 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.025566 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.025604 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:46 crc kubenswrapper[5200]: E1126 13:30:46.026358 5200 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.116175 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9"} Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.787213 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:46 crc kubenswrapper[5200]: E1126 13:30:46.859572 5200 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b91a8b58ea3f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 13:30:32.832238577 +0000 UTC m=+1.511440435,LastTimestamp:2025-11-26 13:30:32.832238577 +0000 UTC m=+1.511440435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 13:30:46 crc kubenswrapper[5200]: I1126 13:30:46.942002 5200 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Nov 26 13:30:46 crc kubenswrapper[5200]: E1126 13:30:46.943854 5200 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.123224 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625"} Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.127546 5200 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.128006 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65"} Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.128471 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.128595 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.128676 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:47 crc kubenswrapper[5200]: E1126 13:30:47.129010 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:47 crc kubenswrapper[5200]: I1126 13:30:47.788632 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.131970 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54"} Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.132980 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.134638 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938"} Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.134913 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.135568 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.135605 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.135618 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:48 crc kubenswrapper[5200]: E1126 13:30:48.135972 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:48 crc kubenswrapper[5200]: I1126 13:30:48.787867 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.139797 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376"} Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.139919 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.140730 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.140893 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.141042 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:49 crc kubenswrapper[5200]: E1126 13:30:49.141429 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.144278 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04"} Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.144318 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28"} Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.144463 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.144952 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.144980 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:49 crc kubenswrapper[5200]: I1126 13:30:49.144990 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:49 crc kubenswrapper[5200]: E1126 13:30:49.145183 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.146636 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.147351 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.147427 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.147558 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.147647 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.147716 5200 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.148180 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.148248 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:50 crc kubenswrapper[5200]: I1126 13:30:50.148274 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:50 crc kubenswrapper[5200]: E1126 13:30:50.148725 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:50 crc kubenswrapper[5200]: E1126 13:30:50.149063 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.148957 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.149656 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.149749 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.149771 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:51 crc kubenswrapper[5200]: E1126 13:30:51.150411 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.207897 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.208177 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.209179 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.209227 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.209241 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:51 crc kubenswrapper[5200]: E1126 13:30:51.212700 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.217647 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.675734 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.831810 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-etcd/etcd-crc" Nov 26 13:30:51 crc 
kubenswrapper[5200]: I1126 13:30:51.832187 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.833042 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.833075 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:51 crc kubenswrapper[5200]: I1126 13:30:51.833087 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:51 crc kubenswrapper[5200]: E1126 13:30:51.833464 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.151348 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.151347 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.151617 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.152192 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.152243 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.152269 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.152396 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.152496 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.152535 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:52 crc kubenswrapper[5200]: E1126 13:30:52.153036 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:52 crc kubenswrapper[5200]: E1126 13:30:52.153378 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:52 crc kubenswrapper[5200]: I1126 13:30:52.158883 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.027533 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.029194 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.029284 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.029314 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.029372 5200 kubelet_node_status.go:78] "Attempting to register node" node="crc" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.068317 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.154522 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.155584 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.155641 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:53 crc kubenswrapper[5200]: I1126 13:30:53.155662 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:53 crc kubenswrapper[5200]: E1126 13:30:53.156221 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:53 crc kubenswrapper[5200]: E1126 13:30:53.309350 5200 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.134888 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.135202 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.136289 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.136340 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.136356 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:54 crc kubenswrapper[5200]: E1126 13:30:54.136818 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.157837 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.158549 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.158703 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:54 crc kubenswrapper[5200]: I1126 13:30:54.158724 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:54 crc kubenswrapper[5200]: E1126 13:30:54.158986 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" 
err="node \"crc\" not found" node="crc" Nov 26 13:30:56 crc kubenswrapper[5200]: I1126 13:30:56.068756 5200 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 13:30:56 crc kubenswrapper[5200]: I1126 13:30:56.068864 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 13:30:58 crc kubenswrapper[5200]: I1126 13:30:58.390789 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 26 13:30:58 crc kubenswrapper[5200]: I1126 13:30:58.391178 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:58 crc kubenswrapper[5200]: I1126 13:30:58.392615 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:58 crc kubenswrapper[5200]: I1126 13:30:58.392681 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:58 crc kubenswrapper[5200]: I1126 13:30:58.392736 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:58 crc kubenswrapper[5200]: E1126 13:30:58.393458 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:58 crc kubenswrapper[5200]: I1126 13:30:58.487904 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 26 13:30:59 crc kubenswrapper[5200]: I1126 13:30:59.172628 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:30:59 crc kubenswrapper[5200]: I1126 13:30:59.173878 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:30:59 crc kubenswrapper[5200]: I1126 13:30:59.173946 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:30:59 crc kubenswrapper[5200]: I1126 13:30:59.173974 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:30:59 crc kubenswrapper[5200]: E1126 13:30:59.174777 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:30:59 crc kubenswrapper[5200]: I1126 13:30:59.194612 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 26 13:30:59 crc kubenswrapper[5200]: I1126 13:30:59.820396 5200 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.175135 5200 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.175814 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.175856 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.175868 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:00 crc kubenswrapper[5200]: E1126 13:31:00.176316 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.636182 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.636273 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.643189 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 26 13:31:00 crc kubenswrapper[5200]: I1126 13:31:00.643418 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 26 13:31:02 crc kubenswrapper[5200]: E1126 13:31:02.448502 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="7s" Nov 26 13:31:03 crc kubenswrapper[5200]: E1126 13:31:03.309797 5200 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 13:31:03 crc kubenswrapper[5200]: I1126 13:31:03.596628 5200 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Nov 26 13:31:03 crc kubenswrapper[5200]: I1126 13:31:03.619355 5200 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.143543 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.143961 5200 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.145097 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.145182 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.145212 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:04 crc kubenswrapper[5200]: E1126 13:31:04.146011 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.152975 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.184944 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.184999 5200 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.185629 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.185957 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:04 crc kubenswrapper[5200]: I1126 13:31:04.186000 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:04 crc kubenswrapper[5200]: E1126 13:31:04.186596 5200 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.631729 5200 trace.go:236] Trace[1311321930]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 13:30:53.432) (total time: 12199ms): Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[1311321930]: ---"Objects listed" error: 12199ms (13:31:05.631) Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[1311321930]: [12.199074816s] [12.199074816s] END Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.631761 5200 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.634802 5200 trace.go:236] Trace[903386141]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 13:30:50.742) (total time: 14891ms): Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[903386141]: ---"Objects listed" error: 14891ms (13:31:05.634) Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[903386141]: [14.891914509s] [14.891914509s] END Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.634834 5200 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.637708 5200 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.637945 5200 trace.go:236] Trace[66051970]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 
13:30:50.919) (total time: 14718ms): Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[66051970]: ---"Objects listed" error: 14718ms (13:31:05.637) Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[66051970]: [14.718295712s] [14.718295712s] END Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.637974 5200 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.643999 5200 trace.go:236] Trace[35322015]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 13:30:53.347) (total time: 12296ms): Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[35322015]: ---"Objects listed" error: 12296ms (13:31:05.643) Nov 26 13:31:05 crc kubenswrapper[5200]: Trace[35322015]: [12.296087215s] [12.296087215s] END Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.644050 5200 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.664079 5200 kubelet_node_status.go:127] "Node was previously registered" node="crc" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.664357 5200 kubelet_node_status.go:81] "Successfully registered node" node="crc" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.665353 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.665433 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.665463 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.665502 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.665526 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Nov 26 13:31:05 crc kubenswrapper[5200]: E1126 13:31:05.687432 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"s
izeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.691289 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.691335 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 
13:31:05.691346 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.691368 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.691383 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 26 13:31:05 crc kubenswrapper[5200]: E1126 13:31:05.702142 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"s
izeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.707027 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.707083 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 
13:31:05.707100 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.707127 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.707143 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.715526 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.715538 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.715578 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.715617 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.715907 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.715931 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 26 13:31:05 crc kubenswrapper[5200]: E1126 13:31:05.719009 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed71
37c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60
729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.722585 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.723019 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.723116 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.723198 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.723277 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Nov 26 13:31:05 crc kubenswrapper[5200]: E1126 13:31:05.733581 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"s
izeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.736244 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.737325 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 
13:31:05.737351 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.737364 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.737384 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.737397 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 26 13:31:05 crc kubenswrapper[5200]: E1126 13:31:05.748269 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:31:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"s
izeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:31:05 crc kubenswrapper[5200]: E1126 13:31:05.748441 5200 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.749768 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 
13:31:05.749835 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.749850 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.749872 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.749925 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.759988 5200 apiserver.go:52] "Watching apiserver" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.852014 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.852069 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.852088 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.852111 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.852129 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.954663 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.954750 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.954771 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.954794 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:05 crc kubenswrapper[5200]: I1126 13:31:05.954812 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:05Z","lastTransitionTime":"2025-11-26T13:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.057444 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.057501 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.057519 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.057541 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.057558 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.068441 5200 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.068539 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.159615 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.159672 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.159714 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.159731 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.159742 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.262252 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.262325 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.262345 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.262371 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.262390 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.364708 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.364757 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.364768 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.364782 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.364792 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.467425 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.467499 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.467518 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.467543 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.467562 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.570231 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.570637 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.570839 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.570992 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.571129 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.673484 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.673535 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.673547 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.673565 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.673577 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.776133 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.776191 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.776204 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.776223 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.776234 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.878355 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.878396 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.878407 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.878421 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.878433 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.980490 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.980547 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.980560 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.980581 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:06 crc kubenswrapper[5200]: I1126 13:31:06.980594 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:06Z","lastTransitionTime":"2025-11-26T13:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.083200 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.083309 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.083331 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.083355 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.083375 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.185977 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.186060 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.186080 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.186106 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.186124 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.287665 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.287781 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.287803 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.287825 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.287842 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.389888 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.389933 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.389943 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.389958 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.389971 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.492713 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.492760 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.492770 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.492786 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.492796 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.595147 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.595385 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.595404 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.595427 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.595444 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.697530 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.697593 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.697606 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.697622 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.697632 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.800279 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.800331 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.800347 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.800365 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.800378 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.903001 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.903097 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.903120 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.903146 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:07 crc kubenswrapper[5200]: I1126 13:31:07.903165 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:07Z","lastTransitionTime":"2025-11-26T13:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.006514 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.006582 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.006601 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.006625 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.006643 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:08Z","lastTransitionTime":"2025-11-26T13:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.109865 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.109945 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.109961 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.109983 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.109997 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:08Z","lastTransitionTime":"2025-11-26T13:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.212975 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.213068 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.213092 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.213123 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.213142 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:08Z","lastTransitionTime":"2025-11-26T13:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.316167 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.316227 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.316237 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.316256 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.316268 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:08Z","lastTransitionTime":"2025-11-26T13:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.419246 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.419335 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.419358 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.419390 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:31:08 crc kubenswrapper[5200]: I1126 13:31:08.419413 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:31:08Z","lastTransitionTime":"2025-11-26T13:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.040721 5200 trace.go:236] Trace[967169878]: "Reflector ListAndWatch" name:pkg/kubelet/config/apiserver.go:66 (26-Nov-2025 13:31:05.760) (total time: 55279ms): Nov 26 13:32:01 crc kubenswrapper[5200]: Trace[967169878]: ---"Objects listed" error: 55279ms (13:32:01.040) Nov 26 13:32:01 crc kubenswrapper[5200]: Trace[967169878]: [55.279994111s] [55.279994111s] END Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.041269 5200 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.064152 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.064290 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.069115 5200 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-crc.187b91a903680794\" is forbidden: quota.openshift.io/ClusterResourceQuota: caches not synchronized" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.187b91a903680794 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 13:30:34.138331028 +0000 UTC m=+2.817532846,LastTimestamp:2025-11-26 13:30:34.138331028 +0000 
UTC m=+2.817532846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.083940 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.084058 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.089194 5200 kubelet.go:2475] "Skipping pod synchronization" err="container runtime is down" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.089669 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.095732 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.095788 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.095800 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.095820 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.095833 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"[container runtime is down, container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?]"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.112909 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.112969 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.112983 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.113002 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.113013 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.121887 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.134409 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"name
s\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.160281 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.160358 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.160401 5200 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.160425 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.160437 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.179683 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"name
s\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.191946 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.192047 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:32:01 crc 
kubenswrapper[5200]: I1126 13:32:01.192065 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5","openshift-network-diagnostics/network-check-target-fhkjl","openshift-network-node-identity/network-node-identity-dgvkt","openshift-network-operator/iptables-alerter-5jnd7","openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv","openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"] Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.192821 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.194858 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.195499 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.195543 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.195559 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.195582 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.195597 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.198204 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.198316 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.198777 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.199790 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.199861 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.200733 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.202020 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.202961 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.202991 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.203093 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.203400 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.203463 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.204048 5200 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.204098 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.208141 5200 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.208271 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.208306 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.208499 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.208622 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.208626 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.209199 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.209383 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.209649 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.210087 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"name
s\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.214345 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.214399 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.214410 5200 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.214425 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.214435 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.214176 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.224556 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"name
s\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.228438 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.229267 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.229308 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.229317 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.229332 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.229343 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.237400 5200 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400464Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861264Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"name
s\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf55aba6-4062-43be-a58f-da4b1f8292ae\\\",\\\"systemUUID\\\":\\\"ff7ef8e9-7d9f-4686-9878-860adef1fa54\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.237527 5200 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.241999 5200 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.242387 5200 kubelet_node_status.go:736] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.242454 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.242478 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.242501 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.242517 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.243152 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\
\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.250723 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258305 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") pod \"af41de71-79cf-4590-bbe9-9e8b848862cb\" (UID: \"af41de71-79cf-4590-bbe9-9e8b848862cb\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258366 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258385 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258428 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258447 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258462 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258499 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258517 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") pod 
\"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258531 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258547 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258580 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258595 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258610 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258625 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258659 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258676 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258721 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258748 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258766 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258782 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258797 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258811 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258829 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258845 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258860 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258875 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258891 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258906 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258921 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258938 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258956 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.258970 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259020 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259035 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259082 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259115 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259170 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259187 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259203 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259241 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259259 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259274 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259325 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259342 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259358 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259392 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259410 5200 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259425 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259441 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259490 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259514 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259533 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259578 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259600 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259642 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8ts\" (UniqueName: \"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259666 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259723 5200 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259755 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259796 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259816 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259892 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259913 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259950 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259972 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.259989 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260006 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 
13:32:01.260044 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260072 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260089 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260194 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260220 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260242 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260285 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260306 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260323 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: 
\"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260362 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260380 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260396 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260434 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260456 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260473 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260490 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260528 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260546 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260564 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") pod 
\"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260602 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260620 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260638 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260675 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260719 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260757 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260778 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260795 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260812 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260830 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") pod 
\"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260848 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260867 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260884 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260902 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260919 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260940 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260957 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260974 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.260990 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261007 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: 
\"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261026 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261066 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261085 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261100 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261141 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261161 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261178 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261195 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261235 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261252 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Nov 
26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261270 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261312 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261331 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261350 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261403 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261422 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261466 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261483 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261500 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261541 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Nov 26 13:32:01 
crc kubenswrapper[5200]: I1126 13:32:01.261559 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261576 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261618 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261637 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262185 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262212 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262258 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262283 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: 
\"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262302 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262344 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262365 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262406 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") pod \"0effdbcf-dd7d-404d-9d48-77536d665a5d\" (UID: \"0effdbcf-dd7d-404d-9d48-77536d665a5d\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262431 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262452 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262492 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262512 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262529 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262566 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") pod 
\"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262586 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262604 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262639 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262660 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262676 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262719 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262742 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262761 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262811 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262829 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: 
\"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262865 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262886 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262910 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262957 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262980 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.262998 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263042 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263060 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263119 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") pod 
\"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263138 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263157 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263196 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263217 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263259 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263283 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263303 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263342 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263364 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263386 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddnv\" (UniqueName: 
\"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") pod \"e093be35-bb62-4843-b2e8-094545761610\" (UID: \"e093be35-bb62-4843-b2e8-094545761610\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263424 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263446 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263465 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263505 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263527 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263547 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263586 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263609 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263627 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263664 5200 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263720 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263743 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263764 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263806 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263835 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263876 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263898 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263920 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263962 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " 
Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.263985 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264022 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264045 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264065 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264101 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264123 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264142 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264185 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264207 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264226 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 
13:32:01.264266 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264287 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264308 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264353 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8zz\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264379 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264406 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264435 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.265845 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"li
nux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268328 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268381 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268421 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc 
kubenswrapper[5200]: I1126 13:32:01.268469 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268506 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268540 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268570 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268619 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268666 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268763 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268871 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268918 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268952 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268987 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269036 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269091 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269124 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269171 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269231 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269266 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269300 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " 
pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269344 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269377 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.269426 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.287567 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.291277 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.291635 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.292271 5200 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"openshift-kube-scheduler-crc\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.292291 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261198 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261525 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" (OuterVolumeSpecName: "kube-api-access-d7cps") pod "af41de71-79cf-4590-bbe9-9e8b848862cb" (UID: "af41de71-79cf-4590-bbe9-9e8b848862cb"). InnerVolumeSpecName "kube-api-access-d7cps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.261771 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" (OuterVolumeSpecName: "tmp") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264401 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.264646 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.266131 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" (OuterVolumeSpecName: "kube-api-access-dztfv") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "kube-api-access-dztfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.297817 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.267619 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" (OuterVolumeSpecName: "tmp") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.267813 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" (OuterVolumeSpecName: "kube-api-access-zth6t") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "kube-api-access-zth6t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.268264 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" (OuterVolumeSpecName: "config") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.269672 5200 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.275398 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" (OuterVolumeSpecName: "tmp") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.275895 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.275931 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.276150 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.276217 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.276937 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" (OuterVolumeSpecName: "config") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.276961 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.277056 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:01.777035647 +0000 UTC m=+90.456237465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.277238 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" (OuterVolumeSpecName: "client-ca") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.277455 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.277482 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" (OuterVolumeSpecName: "kube-api-access-xfp5s") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "kube-api-access-xfp5s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.277767 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" (OuterVolumeSpecName: "kube-api-access-qgrkj") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "kube-api-access-qgrkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.277901 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.278340 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" (OuterVolumeSpecName: "kube-api-access-6dmhf") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "kube-api-access-6dmhf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.280030 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" (OuterVolumeSpecName: "config") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.280524 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" (OuterVolumeSpecName: "utilities") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.280666 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.281024 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.281075 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" (OuterVolumeSpecName: "utilities") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.281101 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.281353 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.281963 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" (OuterVolumeSpecName: "kube-api-access-pgx6b") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "kube-api-access-pgx6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.286384 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.286608 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.286753 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.286796 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.287141 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.287579 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.287861 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.287954 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" (OuterVolumeSpecName: "kube-api-access-tknt7") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "kube-api-access-tknt7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.288079 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" (OuterVolumeSpecName: "kube-api-access-99zj9") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "kube-api-access-99zj9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.288250 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" (OuterVolumeSpecName: "utilities") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.288273 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" (OuterVolumeSpecName: "config") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.288353 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.288748 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" (OuterVolumeSpecName: "kube-api-access-xnxbn") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "kube-api-access-xnxbn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.289160 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.289164 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" (OuterVolumeSpecName: "kube-api-access-zsb9b") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "kube-api-access-zsb9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.289858 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" (OuterVolumeSpecName: "kube-api-access-nmmzf") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "kube-api-access-nmmzf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.290649 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.290684 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" (OuterVolumeSpecName: "kube-api-access-xxfcv") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "kube-api-access-xxfcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.290789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.290936 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.291146 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.291390 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" (OuterVolumeSpecName: "kube-api-access-26xrl") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "kube-api-access-26xrl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.291565 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.292144 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.292938 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" (OuterVolumeSpecName: "kube-api-access-8pskd") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "kube-api-access-8pskd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.293132 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.293856 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.293995 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.294445 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" (OuterVolumeSpecName: "images") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.294661 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.294871 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.295089 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" (OuterVolumeSpecName: "kube-api-access-m26jq") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "kube-api-access-m26jq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.295151 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" (OuterVolumeSpecName: "utilities") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.295592 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" (OuterVolumeSpecName: "tmp") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.296426 5200 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.296982 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" (OuterVolumeSpecName: "signing-key") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.297214 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" (OuterVolumeSpecName: "kube-api-access-ddlk9") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "kube-api-access-ddlk9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.298783 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.299592 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:01.799562538 +0000 UTC m=+90.478764346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.299969 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:01.799960269 +0000 UTC m=+90.479162087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.301454 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" (OuterVolumeSpecName: "kube-api-access-pllx6") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "kube-api-access-pllx6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.301501 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.301539 5200 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-rbac-proxy-crio-crc\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.301572 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-etcd/etcd-crc" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.302918 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.302948 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.303757 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" (OuterVolumeSpecName: "whereabouts-flatfile-configmap") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "whereabouts-flatfile-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.303765 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" (OuterVolumeSpecName: "kube-api-access-w94wk") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "kube-api-access-w94wk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.304149 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" (OuterVolumeSpecName: "utilities") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.304261 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.304470 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" (OuterVolumeSpecName: "kube-api-access-mjwtd") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "kube-api-access-mjwtd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.304626 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" (OuterVolumeSpecName: "config") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.304903 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" (OuterVolumeSpecName: "config") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.305012 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.305274 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" (OuterVolumeSpecName: "kube-api-access-6rmnv") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "kube-api-access-6rmnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.305528 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" (OuterVolumeSpecName: "kube-api-access-l87hs") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "kube-api-access-l87hs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309429 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309457 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309475 5200 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309608 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309635 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309645 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:01.809620099 +0000 UTC m=+90.488821917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309651 5200 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.309742 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:01.809719532 +0000 UTC m=+90.488921350 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.320379 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.331659 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.342002 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.344388 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.344418 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.344431 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.344447 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.344457 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.353362 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.364303 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" (OuterVolumeSpecName: "kube-api-access-wj4qr") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "kube-api-access-wj4qr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.364663 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" (OuterVolumeSpecName: "cert") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.364834 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.364931 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" (OuterVolumeSpecName: "utilities") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365094 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365512 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" (OuterVolumeSpecName: "kube-api-access-ftwb6") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "kube-api-access-ftwb6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365533 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" (OuterVolumeSpecName: "kube-api-access-4hb7m") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "kube-api-access-4hb7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365739 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" (OuterVolumeSpecName: "kube-api-access-zg8nc") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "kube-api-access-zg8nc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365835 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" (OuterVolumeSpecName: "kube-api-access-wbmqg") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "kube-api-access-wbmqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.366725 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" (OuterVolumeSpecName: "config") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365855 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.366316 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.366343 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.366473 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.366733 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" (OuterVolumeSpecName: "config") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367017 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" (OuterVolumeSpecName: "kube-api-access-94l9h") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "kube-api-access-94l9h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367131 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" (OuterVolumeSpecName: "kube-api-access-pddnv") pod "e093be35-bb62-4843-b2e8-094545761610" (UID: "e093be35-bb62-4843-b2e8-094545761610"). InnerVolumeSpecName "kube-api-access-pddnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367186 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" (OuterVolumeSpecName: "tmp") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367286 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" (OuterVolumeSpecName: "utilities") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367437 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367525 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367733 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367738 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367817 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.367994 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.368030 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.368038 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.368051 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" (OuterVolumeSpecName: "kube-api-access-tkdh6") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "kube-api-access-tkdh6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.368281 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.365420 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.369269 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.368675 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" (OuterVolumeSpecName: "kube-api-access-l9stx") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "kube-api-access-l9stx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.369597 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.369853 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" (OuterVolumeSpecName: "kube-api-access-9vsz9") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "kube-api-access-9vsz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370172 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" (OuterVolumeSpecName: "kube-api-access-grwfz") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "kube-api-access-grwfz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370243 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" (OuterVolumeSpecName: "config") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370255 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" (OuterVolumeSpecName: "kube-api-access-4g8ts") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "kube-api-access-4g8ts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370378 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370443 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" (OuterVolumeSpecName: "config") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370567 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370765 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" (OuterVolumeSpecName: "kube-api-access-q4smf") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "kube-api-access-q4smf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.370829 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.368547 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.371055 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" (OuterVolumeSpecName: "config-volume") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.371095 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" (OuterVolumeSpecName: "kube-api-access-ptkcf") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "kube-api-access-ptkcf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.371115 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.371100 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" (OuterVolumeSpecName: "kube-api-access-ws8zz") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "kube-api-access-ws8zz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.371374 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" (OuterVolumeSpecName: "config") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.372441 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.372879 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373191 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: W1126 13:32:01.373312 5200 empty_dir.go:511] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/31fa8943-81cc-4750-a0b7-0fa9ab5af883/volumes/kubernetes.io~projected/kube-api-access-grwfz Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373352 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" (OuterVolumeSpecName: "kube-api-access-grwfz") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "kube-api-access-grwfz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373470 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373589 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373614 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Nov 26 13:32:01 crc kubenswrapper[5200]: W1126 13:32:01.373760 5200 empty_dir.go:511] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9f71a554-e414-4bc3-96d2-674060397afe/volumes/kubernetes.io~configmap/trusted-ca Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373750 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373773 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373776 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373913 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.373996 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.374175 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" (OuterVolumeSpecName: "kube-api-access-d4tqq") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "kube-api-access-d4tqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.375809 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" (OuterVolumeSpecName: "kube-api-access-7jjkz") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "kube-api-access-7jjkz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.376778 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" (OuterVolumeSpecName: "kube-api-access-rzt4w") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "kube-api-access-rzt4w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.377240 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.378066 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.378302 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.378435 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.378539 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluste
r-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.378789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" (OuterVolumeSpecName: "tmp") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.379018 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" (OuterVolumeSpecName: "ca-trust-extracted-pem") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "ca-trust-extracted-pem". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.379161 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.379261 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380396 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380430 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380455 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380472 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380487 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380500 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380519 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380533 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380546 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380560 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.380582 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381021 5200 reconciler_common.go:299] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381036 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381048 5200 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381060 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381128 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381168 5200 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381180 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381191 5200 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381201 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381173 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381212 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382189 5200 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381354 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" (OuterVolumeSpecName: "config") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.381774 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" (OuterVolumeSpecName: "images") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382056 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382168 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" (OuterVolumeSpecName: "config") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382196 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" (OuterVolumeSpecName: "utilities") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382717 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382827 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.382864 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" (OuterVolumeSpecName: "kube-api-access-5lcfw") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "kube-api-access-5lcfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383279 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383311 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383362 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" (OuterVolumeSpecName: "kube-api-access-6g4lr") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "kube-api-access-6g4lr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383486 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383518 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383554 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383569 5200 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383584 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383604 5200 reconciler_common.go:299] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383618 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383631 5200 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383641 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383653 5200 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383664 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383674 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383723 5200 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383737 5200 reconciler_common.go:299] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383748 5200 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383760 5200 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383771 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383782 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383792 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383802 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383813 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 
13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383823 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383833 5200 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383844 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383854 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383867 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383876 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383888 5200 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383899 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383911 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383920 5200 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383930 5200 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383940 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383951 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 
13:32:01.383960 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383970 5200 reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383980 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.383990 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384002 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384015 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384024 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384034 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384045 5200 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384055 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384064 5200 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384074 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384084 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384096 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pskd\" 
(UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384108 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384123 5200 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384133 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384143 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384154 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384170 5200 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384181 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384198 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384208 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384218 5200 reconciler_common.go:299] "Volume detached for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384231 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384245 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384256 5200 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384286 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384301 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384312 5200 reconciler_common.go:299] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384322 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384339 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384351 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384361 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384370 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384380 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384391 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384407 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384422 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws8zz\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384437 5200 reconciler_common.go:299] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384448 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384458 5200 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384468 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384478 5200 reconciler_common.go:299] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384489 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384499 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.384517 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.387852 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.388438 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.388468 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" (OuterVolumeSpecName: "console-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.389004 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.389146 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.389528 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.389489 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.389662 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.389877 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.390117 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.390221 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.390415 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.390586 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391025 5200 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391157 5200 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391181 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391222 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391239 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391253 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391258 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" (OuterVolumeSpecName: "kube-api-access-8nspp") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "kube-api-access-8nspp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391270 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391397 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391422 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391440 5200 reconciler_common.go:299] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391452 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391471 5200 reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391489 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391503 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391521 5200 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391534 5200 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391545 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391555 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391569 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") on node \"crc\" DevicePath \"\"" 
Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391580 5200 reconciler_common.go:299] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391591 5200 reconciler_common.go:299] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391602 5200 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391615 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391626 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391637 5200 reconciler_common.go:299] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391648 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391667 5200 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391769 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391782 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391797 5200 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391808 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391821 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391833 5200 
reconciler_common.go:299] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391848 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391860 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391871 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391881 5200 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391896 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391907 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391917 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g8ts\" (UniqueName: \"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391935 5200 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391946 5200 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391956 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391426 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391435 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391606 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391627 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.391921 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" (OuterVolumeSpecName: "kube-api-access-9z4sw") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "kube-api-access-9z4sw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.392395 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.393672 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" (OuterVolumeSpecName: "kube-api-access-hm9x7") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "kube-api-access-hm9x7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.394307 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.394907 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395078 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395097 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395127 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395339 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395357 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395462 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.395716 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" (OuterVolumeSpecName: "kube-api-access-8nb9c") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "kube-api-access-8nb9c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.396501 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" (OuterVolumeSpecName: "kube-api-access-z5rsr") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "kube-api-access-z5rsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.396757 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.396823 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" (OuterVolumeSpecName: "config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.396736 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.396857 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.397049 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.397129 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.397310 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.397296 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.397441 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.398313 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.398339 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" (OuterVolumeSpecName: "certs") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.399351 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.398876 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.399592 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.399909 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" (OuterVolumeSpecName: "kube-api-access-twvbl") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "kube-api-access-twvbl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.399964 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.399993 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.399997 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" (OuterVolumeSpecName: "kube-api-access-ks6v2") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "kube-api-access-ks6v2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.400211 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.400366 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" (OuterVolumeSpecName: "kube-api-access-hckvg") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "kube-api-access-hckvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.400657 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" (OuterVolumeSpecName: "kube-api-access-m5lgh") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "kube-api-access-m5lgh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.400835 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.400805 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.400907 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" (OuterVolumeSpecName: "config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.401189 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" (OuterVolumeSpecName: "serviceca") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.401319 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" (OuterVolumeSpecName: "kube-api-access-qqbfk") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "kube-api-access-qqbfk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.401370 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.402024 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" (OuterVolumeSpecName: "audit") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.402589 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" (OuterVolumeSpecName: "kube-api-access-sbc2l") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). 
InnerVolumeSpecName "kube-api-access-sbc2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.402821 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.403430 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.408074 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.408516 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" (OuterVolumeSpecName: "kube-api-access-mfzkj") pod "0effdbcf-dd7d-404d-9d48-77536d665a5d" (UID: "0effdbcf-dd7d-404d-9d48-77536d665a5d"). InnerVolumeSpecName "kube-api-access-mfzkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.408632 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.409470 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.410761 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.415463 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" 
(OuterVolumeSpecName: "image-import-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.415981 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.422842 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.425366 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e730
73f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.429413 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.436563 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.447089 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.447155 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.447166 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.447189 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.447204 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.448834 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.462579 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.475276 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.485360 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc888
3ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.487262 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.492961 5200 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.492994 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493005 5200 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493013 5200 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493023 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493032 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493041 5200 reconciler_common.go:299] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493050 5200 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493062 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493070 5200 reconciler_common.go:299] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493080 5200 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493088 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493097 5200 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc 
kubenswrapper[5200]: I1126 13:32:01.493104 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493113 5200 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493121 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493129 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493138 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493148 5200 reconciler_common.go:299] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493156 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493165 5200 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493173 5200 reconciler_common.go:299] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493183 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493191 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493200 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493208 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493219 5200 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493229 5200 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493237 5200 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493246 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493262 5200 reconciler_common.go:299] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493273 5200 reconciler_common.go:299] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493282 5200 reconciler_common.go:299] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493290 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493299 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493307 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493317 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493338 5200 reconciler_common.go:299] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493346 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493355 5200 
reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493365 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493376 5200 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493385 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493394 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493402 5200 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493410 5200 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493418 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493427 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493435 5200 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493443 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493451 5200 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493459 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 
13:32:01.493467 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493476 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493485 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493493 5200 reconciler_common.go:299] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493501 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493508 5200 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493516 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493524 5200 reconciler_common.go:299] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493532 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493540 5200 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493548 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493585 5200 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493592 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493600 5200 reconciler_common.go:299] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493608 5200 reconciler_common.go:299] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493617 5200 reconciler_common.go:299] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493627 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493639 5200 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493651 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493663 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493675 5200 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493704 5200 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.493716 5200 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.495742 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.506474 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.511346 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.515186 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.520841 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.534045 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.549833 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.549972 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.550026 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.550037 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.550074 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.550085 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.595737 5200 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.595784 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.652594 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.652641 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.652655 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.652755 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.652781 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.758503 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.758549 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.758559 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.758583 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.758593 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.797422 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.797668 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:02.797643643 +0000 UTC m=+91.476845461 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.860892 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.860966 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.860979 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.860995 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.861009 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.898632 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.898710 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.898737 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.898772 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.898919 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.898938 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.898951 5200 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899010 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:02.898994008 +0000 UTC m=+91.578195836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899404 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899429 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899440 5200 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899475 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:02.899464742 +0000 UTC m=+91.578666580 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899527 5200 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899556 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:02.899548934 +0000 UTC m=+91.578750762 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899586 5200 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: E1126 13:32:01.899611 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:02.899604275 +0000 UTC m=+91.578806103 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.933353 5200 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8sxgf" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.946025 5200 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8sxgf" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.962938 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.962978 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.962987 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.963002 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:01 crc kubenswrapper[5200]: I1126 13:32:01.963011 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:01Z","lastTransitionTime":"2025-11-26T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.065545 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.066038 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.066055 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.066077 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.066091 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.104363 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"d4f8d133c394ce092bc5b467f2cfec438f2dba1c6605396ed1c6367c51a5b4dc"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.107368 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.107411 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"6e802d6ce9768353cecd922dee9a595c36d72d366c2c4009428a3518c28204e1"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.109486 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"e246394f8c4438ea4816c401e4608482b7bb58a72e7bc512f28947888981b8b5"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.109514 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"f0323d67ebefa6275a0d2fe057a7c201b02fcd101a2371f63dece547cca2e245"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.114065 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.116251 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376" exitCode=255 Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.116304 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.117941 5200 scope.go:117] "RemoveContainer" containerID="71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.143657 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.170488 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.170534 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.170544 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.170560 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.170574 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.176496 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.230933 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"593def59-8f2e-4a01-b78d-8c3f74e9ff11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:43Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de74
2f4e5319fc8f6d5279701c23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.251805 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf
9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources
\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.271057 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha25
6:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.273230 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.273280 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.273291 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.273312 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.273324 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.283542 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\
\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.295951 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.308369 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.320644 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.336222 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 
13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.345753 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.355768 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.369656 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.375810 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.375925 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.375943 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.375964 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.376012 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.398544 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593def59-8f2e-4a01-b78d-8c3f74e9ff11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\
\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:43Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.414934 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"S_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 13:31:05.665766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 13:31:05.665770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 13:31:05.665773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 13:31:05.665776 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 13:31:05.665810 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nI1126 13:31:05.674255 1 requestheader_controller.go:180] Starting RequestHeaderAuthRequestController\\\\nI1126 13:31:05.674350 1 shared_informer.go:350] \\\\\\\"Waiting for caches to sync\\\\\\\" controller=\\\\\\\"RequestHeaderAuthRequestController\\\\\\\"\\\\nI1126 13:31:05.674448 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3700699131/tls.crt::/tmp/serving-cert-3700699131/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764163848\\\\\\\\\\\\\\\" (2025-11-26 13:30:48 +0000 UTC to 2025-11-26 13:30:49 +0000 UTC (now=2025-11-26 13:31:05.674394794 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674832 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764163859\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764163859\\\\\\\\\\\\\\\" (2025-11-26 12:30:59 +0000 UTC to 2028-11-26 12:30:59 +0000 UTC (now=2025-11-26 13:31:05.674797695 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674867 1 secure_serving.go:211] Serving securely on [::]:17697\\\\nI1126 13:31:05.674912 1 genericapiserver.go:696] [graceful-termination] waiting for shutdown to be initiated\\\\nI1126 13:31:05.675463 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1126 13:31:05.675787 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8fe
eea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.438768 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.440206 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.476120 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.478322 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.478384 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.478400 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.478428 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.478442 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.506661 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.530725 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.581985 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.582066 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.582081 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.582121 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.582135 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.583203 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.622507 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.649389 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.684738 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.684888 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.684904 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.684924 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.684937 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.697001 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wqbhz"] Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.713514 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-d2nf9"] Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.713726 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.717702 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.717752 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.719723 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.731770 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.741322 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-wqbhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxd2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:32:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wqbhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.764722 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593def59-8f2e-4a01-b78d-8c3f74e9ff11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\
\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:43Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.778076 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-op
erator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"S_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 13:31:05.665766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 13:31:05.665770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 13:31:05.665773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 13:31:05.665776 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 13:31:05.665810 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nI1126 13:31:05.674255 1 requestheader_controller.go:180] Starting RequestHeaderAuthRequestController\\\\nI1126 13:31:05.674350 1 shared_informer.go:350] \\\\\\\"Waiting for caches to sync\\\\\\\" controller=\\\\\\\"RequestHeaderAuthRequestController\\\\\\\"\\\\nI1126 13:31:05.674448 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3700699131/tls.crt::/tmp/serving-cert-3700699131/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764163848\\\\\\\\\\\\\\\" (2025-11-26 13:30:48 +0000 UTC to 2025-11-26 13:30:49 +0000 UTC (now=2025-11-26 13:31:05.674394794 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674832 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764163859\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764163859\\\\\\\\\\\\\\\" (2025-11-26 12:30:59 +0000 UTC to 2028-11-26 12:30:59 +0000 UTC (now=2025-11-26 13:31:05.674797695 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674867 1 secure_serving.go:211] Serving securely on [::]:17697\\\\nI1126 13:31:05.674912 1 genericapiserver.go:696] [graceful-termination] waiting for shutdown to be initiated\\\\nI1126 13:31:05.675463 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1126 13:31:05.675787 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd6
02559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.787868 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.787916 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.787931 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.787949 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.787961 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.795647 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedRes
ources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807030 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807162 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-tmp-dir\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807196 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2eae436-2fa1-4b90-8656-d39061de1908-proxy-tls\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807222 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxd2r\" (UniqueName: \"kubernetes.io/projected/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-kube-api-access-bxd2r\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807247 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2eae436-2fa1-4b90-8656-d39061de1908-mcd-auth-proxy-config\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807289 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-hosts-file\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807325 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2eae436-2fa1-4b90-8656-d39061de1908-rootfs\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807346 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn98b\" (UniqueName: \"kubernetes.io/projected/a2eae436-2fa1-4b90-8656-d39061de1908-kube-api-access-vn98b\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.807467 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:04.807442405 +0000 UTC m=+93.486644223 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.807804 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.827624 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.839018 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.839840 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.840922 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.841091 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.841654 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.841665 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.842244 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.857085 5200 status_manager.go:919] "Failed to update status 
for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.867871 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 
13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.876751 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.886899 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.890662 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.890736 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.890750 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.890769 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.890782 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.905720 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593def59-8f2e-4a01-b78d-8c3f74e9ff11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\
\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:43Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.907796 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-hosts-file\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.907856 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.907887 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2eae436-2fa1-4b90-8656-d39061de1908-rootfs\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.907912 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn98b\" (UniqueName: \"kubernetes.io/projected/a2eae436-2fa1-4b90-8656-d39061de1908-kube-api-access-vn98b\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.907985 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2eae436-2fa1-4b90-8656-d39061de1908-rootfs\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908059 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-hosts-file\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908108 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908213 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908253 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-tmp-dir\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908283 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2eae436-2fa1-4b90-8656-d39061de1908-proxy-tls\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908312 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxd2r\" (UniqueName: \"kubernetes.io/projected/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-kube-api-access-bxd2r\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908329 5200 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908345 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908368 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908383 5200 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908336 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2eae436-2fa1-4b90-8656-d39061de1908-mcd-auth-proxy-config\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908336 5200 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908422 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. 
No retries permitted until 2025-11-26 13:32:04.908394379 +0000 UTC m=+93.587596257 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908547 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:04.908525753 +0000 UTC m=+93.587727571 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908568 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:04.908559204 +0000 UTC m=+93.587761112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.908604 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908718 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908734 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908747 5200 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.908791 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. 
No retries permitted until 2025-11-26 13:32:04.90878135 +0000 UTC m=+93.587983178 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.909316 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-tmp-dir\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.909362 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2eae436-2fa1-4b90-8656-d39061de1908-mcd-auth-proxy-config\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.917064 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2eae436-2fa1-4b90-8656-d39061de1908-proxy-tls\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.925732 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxd2r\" (UniqueName: \"kubernetes.io/projected/cb07d719-2ce6-4c5e-8ef6-f3455c165ea8-kube-api-access-bxd2r\") pod \"node-resolver-wqbhz\" (UID: \"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\") " pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.927681 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn98b\" (UniqueName: \"kubernetes.io/projected/a2eae436-2fa1-4b90-8656-d39061de1908-kube-api-access-vn98b\") pod \"machine-config-daemon-d2nf9\" (UID: \"a2eae436-2fa1-4b90-8656-d39061de1908\") " pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.932968 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\
\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"S_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 13:31:05.665766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 13:31:05.665770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 13:31:05.665773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 13:31:05.665776 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 13:31:05.665810 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1126 13:31:05.674255 1 requestheader_controller.go:180] Starting RequestHeaderAuthRequestController\\\\nI1126 13:31:05.674350 1 shared_informer.go:350] \\\\\\\"Waiting for caches to sync\\\\\\\" controller=\\\\\\\"RequestHeaderAuthRequestController\\\\\\\"\\\\nI1126 13:31:05.674448 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3700699131/tls.crt::/tmp/serving-cert-3700699131/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764163848\\\\\\\\\\\\\\\" (2025-11-26 13:30:48 +0000 UTC to 2025-11-26 13:30:49 +0000 UTC (now=2025-11-26 13:31:05.674394794 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674832 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764163859\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764163859\\\\\\\\\\\\\\\" (2025-11-26 12:30:59 +0000 UTC to 2028-11-26 12:30:59 +0000 UTC (now=2025-11-26 13:31:05.674797695 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674867 1 secure_serving.go:211] Serving securely on [::]:17697\\\\nI1126 13:31:05.674912 1 genericapiserver.go:696] [graceful-termination] waiting for shutdown to be initiated\\\\nI1126 13:31:05.675463 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1126 13:31:05.675787 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.950781 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha25
6:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.961942 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf
1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.969203 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.969361 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.969492 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.969816 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.969837 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:02 crc kubenswrapper[5200]: E1126 13:32:02.969954 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.973184 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.977330 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01080b46-74f1-4191-8755-5152a57b3b25" path="/var/lib/kubelet/pods/01080b46-74f1-4191-8755-5152a57b3b25/volumes" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.978181 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cfa50b-4138-4585-a53e-64dd3ab73335" path="/var/lib/kubelet/pods/09cfa50b-4138-4585-a53e-64dd3ab73335/volumes" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.980609 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" path="/var/lib/kubelet/pods/0dd0fbac-8c0d-4228-8faa-abbeedabf7db/volumes" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.983861 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.985656 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0effdbcf-dd7d-404d-9d48-77536d665a5d" path="/var/lib/kubelet/pods/0effdbcf-dd7d-404d-9d48-77536d665a5d/volumes" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.992314 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.992377 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.992390 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.992410 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.992422 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:02Z","lastTransitionTime":"2025-11-26T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.994554 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:02 crc kubenswrapper[5200]: I1126 13:32:02.997571 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149b3c48-e17c-4a66-a835-d86dabf6ff13" path="/var/lib/kubelet/pods/149b3c48-e17c-4a66-a835-d86dabf6ff13/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.005788 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.007903 5200 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2025-12-26 13:27:01 +0000 UTC" deadline="2025-12-22 13:42:21.456656167 +0000 UTC" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.007947 5200 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="624h10m18.448713893s" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.009453 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16bdd140-dce1-464c-ab47-dd5798d1d256" path="/var/lib/kubelet/pods/16bdd140-dce1-464c-ab47-dd5798d1d256/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.012245 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f80adb-c1c3-49ba-8ee4-932c851d3897" path="/var/lib/kubelet/pods/18f80adb-c1c3-49ba-8ee4-932c851d3897/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.014237 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.020298 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" path="/var/lib/kubelet/pods/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.021084 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325ffef-9d5b-447f-b00e-3efc429acefe" path="/var/lib/kubelet/pods/2325ffef-9d5b-447f-b00e-3efc429acefe/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.022854 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301e1965-1754-483d-b6cc-bfae7038bbca" path="/var/lib/kubelet/pods/301e1965-1754-483d-b6cc-bfae7038bbca/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.024421 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.024757 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa8943-81cc-4750-a0b7-0fa9ab5af883" path="/var/lib/kubelet/pods/31fa8943-81cc-4750-a0b7-0fa9ab5af883/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.026906 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a11a02-47e1-488f-b270-2679d3298b0e" path="/var/lib/kubelet/pods/42a11a02-47e1-488f-b270-2679d3298b0e/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.026916 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wqbhz" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.027744 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567683bd-0efc-4f21-b076-e28559628404" path="/var/lib/kubelet/pods/567683bd-0efc-4f21-b076-e28559628404/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.030670 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584e1f4a-8205-47d7-8efb-3afc6017c4c9" path="/var/lib/kubelet/pods/584e1f4a-8205-47d7-8efb-3afc6017c4c9/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.031272 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593a3561-7760-45c5-8f91-5aaef7475d0f" path="/var/lib/kubelet/pods/593a3561-7760-45c5-8f91-5aaef7475d0f/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.032315 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebfebf6-3ecd-458e-943f-bb25b52e2718" path="/var/lib/kubelet/pods/5ebfebf6-3ecd-458e-943f-bb25b52e2718/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.037486 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6077b63e-53a2-4f96-9d56-1ce0324e4913" path="/var/lib/kubelet/pods/6077b63e-53a2-4f96-9d56-1ce0324e4913/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.041876 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" path="/var/lib/kubelet/pods/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.046856 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edfcf45-925b-4eff-b940-95b6fc0b85d4" path="/var/lib/kubelet/pods/6edfcf45-925b-4eff-b940-95b6fc0b85d4/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.052217 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.069879 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-wqbhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxd2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:32:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wqbhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.078494 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee8fbd3-1f81-4666-96da-5afc70819f1a" path="/var/lib/kubelet/pods/6ee8fbd3-1f81-4666-96da-5afc70819f1a/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.087905 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" path="/var/lib/kubelet/pods/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.090899 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eae436-2fa1-4b90-8656-d39061de1908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn98b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn98b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:32:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2nf9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.094620 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.094665 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.094676 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.094706 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.094717 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.102295 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.113501 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.121383 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.121750 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736c54fe-349c-4bb9-870a-d1c1d1c03831" path="/var/lib/kubelet/pods/736c54fe-349c-4bb9-870a-d1c1d1c03831/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.126380 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7599e0b6-bddf-4def-b7f2-0b32206e8651" path="/var/lib/kubelet/pods/7599e0b6-bddf-4def-b7f2-0b32206e8651/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.129150 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afa918d-be67-40a6-803c-d3b0ae99d815" path="/var/lib/kubelet/pods/7afa918d-be67-40a6-803c-d3b0ae99d815/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 
13:32:03.130116 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df94c10-441d-4386-93a6-6730fb7bcde0" path="/var/lib/kubelet/pods/7df94c10-441d-4386-93a6-6730fb7bcde0/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.131334 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.131917 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" path="/var/lib/kubelet/pods/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.137010 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e39f7b-62e4-4fc9-992a-6535ce127a02" path="/var/lib/kubelet/pods/81e39f7b-62e4-4fc9-992a-6535ce127a02/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.138468 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869851b9-7ffb-4af0-b166-1d8aa40a5f80" path="/var/lib/kubelet/pods/869851b9-7ffb-4af0-b166-1d8aa40a5f80/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.140579 5200 status_manager.go:919] 
"Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.144396 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" path="/var/lib/kubelet/pods/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.145106 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dfbade-90b6-4169-8c07-72cff7f2c82b" path="/var/lib/kubelet/pods/92dfbade-90b6-4169-8c07-72cff7f2c82b/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.152275 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.153403 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.161780 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a6e063-3d1a-4d44-875d-185291448c31" path="/var/lib/kubelet/pods/94a6e063-3d1a-4d44-875d-185291448c31/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.163163 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.163657 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f71a554-e414-4bc3-96d2-674060397afe" path="/var/lib/kubelet/pods/9f71a554-e414-4bc3-96d2-674060397afe/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: W1126 13:32:03.164394 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2eae436_2fa1_4b90_8656_d39061de1908.slice/crio-f790324b04b72c734b13897e3517d38117f324dfa83f1113d67eadfa09b9a0ed WatchSource:0}: Error finding container f790324b04b72c734b13897e3517d38117f324dfa83f1113d67eadfa09b9a0ed: Status 404 returned error can't find the container with id f790324b04b72c734b13897e3517d38117f324dfa83f1113d67eadfa09b9a0ed Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.171281 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-wqbhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxd2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:32:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wqbhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.171957 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a208c9c2-333b-4b4a-be0d-bc32ec38a821" path="/var/lib/kubelet/pods/a208c9c2-333b-4b4a-be0d-bc32ec38a821/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.177767 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" path="/var/lib/kubelet/pods/a52afe44-fb37-46ed-a1f8-bf39727a3cbe/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.181102 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eae436-2fa1-4b90-8656-d39061de1908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn98b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn98b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:32:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d2nf9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.184298 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a555ff2e-0be6-46d5-897d-863bb92ae2b3" path="/var/lib/kubelet/pods/a555ff2e-0be6-46d5-897d-863bb92ae2b3/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.184968 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a88189-c967-4640-879e-27665747f20c" path="/var/lib/kubelet/pods/a7a88189-c967-4640-879e-27665747f20c/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.185945 5200 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.187128 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.197138 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.197198 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.197213 5200 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.197234 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.197247 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.202346 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593def59-8f2e-4a01-b78d-8c3f74e9ff11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61
bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"mem
ory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:43Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83
612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.210729 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af41de71-79cf-4590-bbe9-9e8b848862cb" path="/var/lib/kubelet/pods/af41de71-79cf-4590-bbe9-9e8b848862cb/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.213980 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" path="/var/lib/kubelet/pods/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.228592 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4750666-1362-4001-abd0-6f89964cc621" path="/var/lib/kubelet/pods/b4750666-1362-4001-abd0-6f89964cc621/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.230812 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b605f283-6f2e-42da-a838-54421690f7d0" path="/var/lib/kubelet/pods/b605f283-6f2e-42da-a838-54421690f7d0/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.232817 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c491984c-7d4b-44aa-8c1e-d7974424fa47" path="/var/lib/kubelet/pods/c491984c-7d4b-44aa-8c1e-d7974424fa47/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.234836 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"S_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 13:31:05.665766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 13:31:05.665770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 13:31:05.665773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 13:31:05.665776 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 13:31:05.665810 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1126 13:31:05.674255 1 requestheader_controller.go:180] Starting RequestHeaderAuthRequestController\\\\nI1126 13:31:05.674350 1 shared_informer.go:350] \\\\\\\"Waiting for caches to sync\\\\\\\" controller=\\\\\\\"RequestHeaderAuthRequestController\\\\\\\"\\\\nI1126 13:31:05.674448 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3700699131/tls.crt::/tmp/serving-cert-3700699131/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764163848\\\\\\\\\\\\\\\" (2025-11-26 13:30:48 +0000 UTC to 2025-11-26 13:30:49 +0000 UTC (now=2025-11-26 13:31:05.674394794 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674832 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764163859\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764163859\\\\\\\\\\\\\\\" (2025-11-26 12:30:59 +0000 UTC to 2028-11-26 12:30:59 +0000 UTC (now=2025-11-26 13:31:05.674797695 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674867 1 secure_serving.go:211] Serving securely on [::]:17697\\\\nI1126 13:31:05.674912 1 
genericapiserver.go:696] [graceful-termination] waiting for shutdown to be initiated\\\\nI1126 13:31:05.675463 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1126 13:31:05.675787 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.239380 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c5f2bfad-70f6-4185-a3d9-81ce12720767" path="/var/lib/kubelet/pods/c5f2bfad-70f6-4185-a3d9-81ce12720767/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.241751 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc85e424-18b2-4924-920b-bd291a8c4b01" path="/var/lib/kubelet/pods/cc85e424-18b2-4924-920b-bd291a8c4b01/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.243679 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce090a97-9ab6-4c40-a719-64ff2acd9778" path="/var/lib/kubelet/pods/ce090a97-9ab6-4c40-a719-64ff2acd9778/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.248186 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19cb085-0c5b-4810-b654-ce7923221d90" path="/var/lib/kubelet/pods/d19cb085-0c5b-4810-b654-ce7923221d90/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.250467 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" path="/var/lib/kubelet/pods/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.253296 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d565531a-ff86-4608-9d19-767de01ac31b" path="/var/lib/kubelet/pods/d565531a-ff86-4608-9d19-767de01ac31b/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.254337 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e8f42f-dc0e-424b-bb56-5ec849834888" path="/var/lib/kubelet/pods/d7e8f42f-dc0e-424b-bb56-5ec849834888/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.256567 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" path="/var/lib/kubelet/pods/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.257816 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e093be35-bb62-4843-b2e8-094545761610" path="/var/lib/kubelet/pods/e093be35-bb62-4843-b2e8-094545761610/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.259343 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" path="/var/lib/kubelet/pods/e1d2a42d-af1d-4054-9618-ab545e0ed8b7/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.260547 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f559dfa3-3917-43a2-97f6-61ddfda10e93" path="/var/lib/kubelet/pods/f559dfa3-3917-43a2-97f6-61ddfda10e93/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.270665 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65c0ac1-8bca-454d-a2e6-e35cb418beac" path="/var/lib/kubelet/pods/f65c0ac1-8bca-454d-a2e6-e35cb418beac/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.271855 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" path="/var/lib/kubelet/pods/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.279528 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha25
6:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.280622 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e2c886-118e-43bb-bef1-c78134de392b" path="/var/lib/kubelet/pods/f7e2c886-118e-43bb-bef1-c78134de392b/volumes" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.287232 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" path="/var/lib/kubelet/pods/fc8db2c7-859d-47b3-a900-2bd0c0b2973b/volumes" Nov 26 13:32:03 
crc kubenswrapper[5200]: I1126 13:32:03.288575 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.288639 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.288658 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"00d970c5a8dd87190beb8624ca269ced9fcc976e2b902f93e2d1ac905e607e0d"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.288676 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rzd7"] Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.300045 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.300095 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.300109 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.300132 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.300143 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.308429 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f2
4f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.323160 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.348738 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wqbhz" event={"ID":"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8","Type":"ContainerStarted","Data":"ea2b32fe96975b4a6e42d85fd910634353b569424a51502ec3ecca03084b76be"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.348812 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mc24w"] Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.349027 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.352311 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.352736 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593def59-8f2e-4a01-b78d-8c3f74e9ff11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://1746d826a47c21d3cc09746edaf7eeb6f95f73606620de22b96d31b9b95bec65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://11f704fa5ebe39b96b7117ed3cffb65abc9476a8b103aefc1d7ba13fe242d938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ecc0962d6051b4fc487eead10265ecd461c4e86f250faeb378c47feb0c771f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://37fa273bd86bfd2d1c2bbf3f15c2f493e929f34ac48b2cba7dd8444901250f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e01a7101d5cdb794de4c9881f8b47a374f209769df4cfee65baf5519b44af722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:43Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97e7a9da3af7c814388e0e139392a6b943de742f4e5319fc8f6d5279701c23f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68897077a3cd68c0f9ba5c41f86d213dfad4ac055b1a3ba9e745393513914cf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aded617d70bbb37fa9b66f6f9107f15ee6a5e0f3e7e63a6fe11f0ac02cacfe71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:3
0:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.352865 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.352994 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.353044 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.353073 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.353161 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.354722 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.358958 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4bsk9"] Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.359308 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.361323 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.361414 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.361564 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.361738 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.361782 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.368523 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d60e4b9f-fed8-4c81-882c-33ac5ef7275b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:46Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:42Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\
\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"S_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 13:31:05.665766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 13:31:05.665770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 13:31:05.665773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 13:31:05.665776 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 13:31:05.665810 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1126 13:31:05.674255 1 requestheader_controller.go:180] Starting RequestHeaderAuthRequestController\\\\nI1126 13:31:05.674350 1 shared_informer.go:350] \\\\\\\"Waiting for caches to sync\\\\\\\" controller=\\\\\\\"RequestHeaderAuthRequestController\\\\\\\"\\\\nI1126 13:31:05.674448 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3700699131/tls.crt::/tmp/serving-cert-3700699131/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764163848\\\\\\\\\\\\\\\" (2025-11-26 13:30:48 +0000 UTC to 2025-11-26 13:30:49 +0000 UTC (now=2025-11-26 13:31:05.674394794 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674832 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764163859\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764163859\\\\\\\\\\\\\\\" (2025-11-26 12:30:59 +0000 UTC to 2028-11-26 12:30:59 +0000 UTC (now=2025-11-26 13:31:05.674797695 +0000 UTC))\\\\\\\"\\\\nI1126 13:31:05.674867 1 secure_serving.go:211] Serving securely on [::]:17697\\\\nI1126 13:31:05.674912 1 genericapiserver.go:696] [graceful-termination] waiting for shutdown to be initiated\\\\nI1126 13:31:05.675463 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1126 13:31:05.675787 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:47Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.382243 5200 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61e52e-1d25-475f-ad1e-1cbe4e97250a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://96b3bcad398baf70222d8f3654c34bc744f12910beda5cba42946618fa922434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:36Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://0273e9bd9120ac62c8f1ea1945e8a249fce57d37e538541ba208a4c3d297f20b\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:41Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://fa0c43e4477c9e16863db36878b1ffa988b8e843f139a6ac2502b1f22f8607e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:45Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.384063 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.385798 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.386436 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.387655 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.400597 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35ae0687-3cb3-4330-a5f5-e4eb697ad531\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2972bf62a1ef13e7f8c6c519a94fb26031661adc9c603052a5068f56bb9c1c64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c65dab05adc289c78022bc965b740d22588cfb1841fa87f61c5e228ec9dfb2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5b704a06a500a5cb328bb4264b9d15963d10f9fd659422f23576f24f4dfb76e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:38Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e19ca590ca3352dc2168bbc11833d9d33032b338e9241dc33f9a19fa8a7273\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.403775 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.403821 5200 kubelet_node_status.go:736] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.403836 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.403853 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.403866 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413675 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-log-socket\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413750 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-kubelet\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413812 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-slash\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413833 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-node-log\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413861 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413885 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-config\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413906 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-systemd\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413929 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413954 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-netns\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.413987 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-var-lib-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414007 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-systemd-units\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414028 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-etc-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414061 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88zw\" (UniqueName: \"kubernetes.io/projected/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-kube-api-access-n88zw\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414091 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-netd\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414113 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-script-lib\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414135 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-bin\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414160 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovn-node-metrics-cert\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414180 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414211 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-ovn\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.414234 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-env-overrides\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.441443 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.485990 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.506063 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.506109 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.506121 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.506140 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.506153 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515334 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-env-overrides\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515378 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-cni-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515395 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-cnibin\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515412 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9bd\" (UniqueName: \"kubernetes.io/projected/1c236a7a-7975-4f6e-8a56-07203c1833ef-kube-api-access-cl9bd\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515430 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-log-socket\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515447 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-daemon-config\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515470 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f6a6875-f90c-438b-83a3-0fccba5ff947-cni-binary-copy\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515489 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-kubelet\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515511 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-slash\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc 
kubenswrapper[5200]: I1126 13:32:03.515530 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-node-log\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515546 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515566 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-config\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515592 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-socket-dir-parent\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515621 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-systemd\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515710 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-log-socket\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515740 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-cni-multus\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515762 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515780 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-netns\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 
13:32:03.515798 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515817 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-system-cni-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515894 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-hostroot\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515910 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-netns\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515925 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-conf-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515943 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-etc-kubernetes\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515970 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-var-lib-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.515986 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-multus-certs\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-cnibin\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 
13:32:03.516019 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516035 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-systemd-units\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516053 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-etc-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516082 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-kubelet\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516102 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-slash\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516125 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-systemd\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516158 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-node-log\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516179 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516378 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-config\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516523 5200 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-env-overrides\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516558 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516591 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-netns\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516626 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-var-lib-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516665 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-systemd-units\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516710 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-etc-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516736 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-k8s-cni-cncf-io\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516758 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xsm8\" (UniqueName: \"kubernetes.io/projected/1f6a6875-f90c-438b-83a3-0fccba5ff947-kube-api-access-4xsm8\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516790 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n88zw\" (UniqueName: \"kubernetes.io/projected/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-kube-api-access-n88zw\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516810 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516950 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-netd\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516970 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-script-lib\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.516985 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-cni-bin\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517021 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-os-release\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517049 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-netd\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517326 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-bin\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517353 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovn-node-metrics-cert\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517371 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-os-release\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517409 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517436 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517464 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-bin\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517482 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-script-lib\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517519 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-openvswitch\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517570 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-ovn\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517590 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-kubelet\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517603 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-system-cni-dir\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.517652 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-ovn\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.524225 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://00d970c5a8dd87190beb8624ca269ced9fcc976e2b902f93e2d1ac905e607e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0,1000500000],\\\"uid\\\":1000500000}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://e246394f8c4438ea4816c401e4608482b7bb58a72e7bc512f28947888981b8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0,1000500000],\\\"uid\\\":1000500000}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 
13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.526956 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovn-node-metrics-cert\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.565092 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88zw\" (UniqueName: \"kubernetes.io/projected/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-kube-api-access-n88zw\") pod \"ovnkube-node-5rzd7\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.590420 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T13:32:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a3d103b2931a07f70607a8c2706de2607d8817ddebd8315799c1a35762dca29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:32:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.608364 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.608431 5200 kubelet_node_status.go:736] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.608444 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.608468 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.608480 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619180 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-daemon-config\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619236 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f6a6875-f90c-438b-83a3-0fccba5ff947-cni-binary-copy\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619258 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-socket-dir-parent\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619275 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-cni-multus\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-cni-multus\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619549 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-netns\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619581 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " 
pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619727 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-netns\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619751 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-system-cni-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619812 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-hostroot\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619841 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-conf-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619868 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-etc-kubernetes\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619900 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-system-cni-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619950 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-multus-certs\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619951 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-socket-dir-parent\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619982 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-cnibin\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619995 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-conf-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.619935 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-hostroot\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620006 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-etc-kubernetes\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620011 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620061 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-cnibin\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620028 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-multus-certs\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620134 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-k8s-cni-cncf-io\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620170 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-run-k8s-cni-cncf-io\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620189 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xsm8\" (UniqueName: \"kubernetes.io/projected/1f6a6875-f90c-438b-83a3-0fccba5ff947-kube-api-access-4xsm8\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620250 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " 
pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620301 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-cni-bin\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620327 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-os-release\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620338 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-daemon-config\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620360 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-os-release\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620341 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f6a6875-f90c-438b-83a3-0fccba5ff947-cni-binary-copy\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620440 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620460 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-cni-bin\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620484 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-kubelet\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 
13:32:03.620490 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-os-release\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620445 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-host-var-lib-kubelet\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620515 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620516 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-os-release\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620556 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-system-cni-dir\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620590 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c236a7a-7975-4f6e-8a56-07203c1833ef-system-cni-dir\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620611 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-cni-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620640 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-cnibin\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620666 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9bd\" (UniqueName: \"kubernetes.io/projected/1c236a7a-7975-4f6e-8a56-07203c1833ef-kube-api-access-cl9bd\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620764 5200 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-cnibin\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620845 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.620929 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f6a6875-f90c-438b-83a3-0fccba5ff947-multus-cni-dir\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.621137 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1c236a7a-7975-4f6e-8a56-07203c1833ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.624897 5200 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed56afcc-296f-4f4a-9361-dd001f3490a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T13:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://9fa095bdb36662049f946bb69d771679aa4b828bc1a346509a6681deb74fd990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T13:30:37Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dddcc8883ecc47405b5205742e29033d0897af74b80fbd88428a37350d11b48c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T13:30:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T13:30:35Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T13:30:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.651554 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xsm8\" (UniqueName: \"kubernetes.io/projected/1f6a6875-f90c-438b-83a3-0fccba5ff947-kube-api-access-4xsm8\") pod \"multus-mc24w\" (UID: \"1f6a6875-f90c-438b-83a3-0fccba5ff947\") " pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.668986 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.671950 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9bd\" (UniqueName: \"kubernetes.io/projected/1c236a7a-7975-4f6e-8a56-07203c1833ef-kube-api-access-cl9bd\") pod \"multus-additional-cni-plugins-4bsk9\" (UID: \"1c236a7a-7975-4f6e-8a56-07203c1833ef\") " pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.675858 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mc24w" Nov 26 13:32:03 crc kubenswrapper[5200]: W1126 13:32:03.685531 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ef9b40_5d21_4743_a050_2b1fa0d95b7a.slice/crio-02d3a2db83c2a4dd09e74572f0d6e829dd4fc10f4e8dc1da71dea195f6a6bdb8 WatchSource:0}: Error finding container 02d3a2db83c2a4dd09e74572f0d6e829dd4fc10f4e8dc1da71dea195f6a6bdb8: Status 404 returned error can't find the container with id 02d3a2db83c2a4dd09e74572f0d6e829dd4fc10f4e8dc1da71dea195f6a6bdb8 Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.710619 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.711238 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.711277 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.711287 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.711302 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.711312 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.816057 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.816095 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.816110 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.816130 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.816142 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.921164 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.921210 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.921223 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.921241 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.921252 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:03Z","lastTransitionTime":"2025-11-26T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.944991 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.944975399 podStartE2EDuration="2.944975399s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:03.944512016 +0000 UTC m=+92.623713864" watchObservedRunningTime="2025-11-26 13:32:03.944975399 +0000 UTC m=+92.624177217" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.966151 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.9661294209999998 podStartE2EDuration="2.966129421s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:03.965655018 +0000 UTC m=+92.644856846" watchObservedRunningTime="2025-11-26 13:32:03.966129421 +0000 UTC m=+92.645331239" Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.972409 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-2mkdc"] Nov 26 13:32:03 crc kubenswrapper[5200]: I1126 13:32:03.994823 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.015639 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.023171 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.023421 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.023550 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.023626 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.023703 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.035830 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.054776 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.075433 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126121 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-host\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126165 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-serviceca\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126175 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126208 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126222 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126222 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslsc\" (UniqueName: 
\"kubernetes.io/projected/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-kube-api-access-cslsc\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126237 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.126249 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.132016 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wqbhz" event={"ID":"cb07d719-2ce6-4c5e-8ef6-f3455c165ea8","Type":"ContainerStarted","Data":"67d0262b84cf6170ea2d5d633a43124834ab463d457d578931028de0b46405b3"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.133576 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" exitCode=0 Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.133654 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.133693 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"02d3a2db83c2a4dd09e74572f0d6e829dd4fc10f4e8dc1da71dea195f6a6bdb8"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.135351 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerStarted","Data":"2433c507ae5e9756a135a65359780134b36b9992383c8f33eac6c70818ec4615"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.135377 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerStarted","Data":"19139953c36f99cf1494fb713371e25bbb06c87f2f3e9ec2b2f761b17a0561af"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.136516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc24w" event={"ID":"1f6a6875-f90c-438b-83a3-0fccba5ff947","Type":"ContainerStarted","Data":"9ea63010c351847083a3776fd49e496c4cadfc2912cb79add611f3746bb1de86"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.136546 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc24w" event={"ID":"1f6a6875-f90c-438b-83a3-0fccba5ff947","Type":"ContainerStarted","Data":"d24d99fd985eef0161afb76896065efbba33b1f75009dba7370088d125ee19fb"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.138462 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"48606e122e03d27acea134bfc75ddba6f0467e128de39b28da1617f642612a01"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.138529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"a44e89797844111bb8aded93f9b56ff9fb33b44e5ce546ddc59c7e9f7e94a788"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.138542 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"f790324b04b72c734b13897e3517d38117f324dfa83f1113d67eadfa09b9a0ed"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.191529 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=3.191504667 podStartE2EDuration="3.191504667s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:04.191484006 +0000 UTC m=+92.870685834" watchObservedRunningTime="2025-11-26 13:32:04.191504667 +0000 UTC m=+92.870706495" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.227772 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-host\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.227826 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-serviceca\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.229198 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-serviceca\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.230678 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cslsc\" (UniqueName: \"kubernetes.io/projected/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-kube-api-access-cslsc\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.230902 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-host\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.231111 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.231153 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.231164 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.231182 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.231196 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.262222 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslsc\" (UniqueName: \"kubernetes.io/projected/17ec8ea6-c629-4177-ad4f-bca9f5e926ac-kube-api-access-cslsc\") pod \"node-ca-2mkdc\" (UID: \"17ec8ea6-c629-4177-ad4f-bca9f5e926ac\") " pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.268823 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.268803539 podStartE2EDuration="3.268803539s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:04.24201685 +0000 UTC m=+92.921218678" watchObservedRunningTime="2025-11-26 13:32:04.268803539 +0000 UTC m=+92.948005357" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.269263 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp"] Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.303568 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.303543811 podStartE2EDuration="3.303543811s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:04.295383583 +0000 UTC m=+92.974585421" watchObservedRunningTime="2025-11-26 13:32:04.303543811 +0000 UTC m=+92.982745629" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.324734 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-scmch"] Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.324900 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.332381 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2mkdc" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.333976 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.334013 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.334024 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.334040 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.334070 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.339817 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.341452 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.341551 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.355360 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.435352 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.435401 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8klvc\" (UniqueName: \"kubernetes.io/projected/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-kube-api-access-8klvc\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.435431 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.435459 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.435477 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.435492 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjch8\" (UniqueName: \"kubernetes.io/projected/aff96734-e4bb-4642-9207-9293cce04a16-kube-api-access-sjch8\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.438002 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.438032 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.438040 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.438053 5200 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.438062 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.536419 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.536486 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8klvc\" (UniqueName: \"kubernetes.io/projected/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-kube-api-access-8klvc\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.536527 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.536560 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.536606 5200 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.536719 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs podName:aff96734-e4bb-4642-9207-9293cce04a16 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:05.036677464 +0000 UTC m=+93.715879282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs") pod "network-metrics-daemon-scmch" (UID: "aff96734-e4bb-4642-9207-9293cce04a16") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.537788 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.537883 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.538303 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.538493 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjch8\" (UniqueName: \"kubernetes.io/projected/aff96734-e4bb-4642-9207-9293cce04a16-kube-api-access-sjch8\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.541333 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podStartSLOduration=2.5413167039999998 podStartE2EDuration="2.541316704s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:04.537202618 +0000 UTC m=+93.216404436" watchObservedRunningTime="2025-11-26 13:32:04.541316704 +0000 UTC m=+93.220518512" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.543258 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.545858 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.545891 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.545901 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc 
kubenswrapper[5200]: I1126 13:32:04.545915 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.545924 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.563087 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8klvc\" (UniqueName: \"kubernetes.io/projected/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-kube-api-access-8klvc\") pod \"ovnkube-control-plane-57b78d8988-j5jsp\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.610334 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjch8\" (UniqueName: \"kubernetes.io/projected/aff96734-e4bb-4642-9207-9293cce04a16-kube-api-access-sjch8\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.639553 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.648315 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.648362 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.648376 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.648397 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.648408 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.672354 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mc24w" podStartSLOduration=2.672333649 podStartE2EDuration="2.672333649s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:04.670708444 +0000 UTC m=+93.349910272" watchObservedRunningTime="2025-11-26 13:32:04.672333649 +0000 UTC m=+93.351535467" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.751132 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.751178 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.751189 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.751206 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.751219 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.843076 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.843241 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:08.8432039 +0000 UTC m=+97.522405758 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.854418 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.854482 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.854518 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.854538 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.854554 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.944187 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.944603 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.944629 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.944658 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.944461 5200 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.944942 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:08.944924246 +0000 UTC m=+97.624126064 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.944845 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945349 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945364 5200 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945397 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:08.945387479 +0000 UTC m=+97.624589297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945411 5200 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945509 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945517 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:08.945497712 +0000 UTC m=+97.624699530 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945533 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945550 5200 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.945585 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:08.945575814 +0000 UTC m=+97.624777672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.958538 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.958581 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.958618 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.958635 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.958646 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:04Z","lastTransitionTime":"2025-11-26T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.969002 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.969061 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:04 crc kubenswrapper[5200]: I1126 13:32:04.969084 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.969177 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.969252 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:04 crc kubenswrapper[5200]: E1126 13:32:04.969509 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.045726 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:05 crc kubenswrapper[5200]: E1126 13:32:05.045995 5200 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:05 crc kubenswrapper[5200]: E1126 13:32:05.046105 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs podName:aff96734-e4bb-4642-9207-9293cce04a16 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:06.046077426 +0000 UTC m=+94.725279294 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs") pod "network-metrics-daemon-scmch" (UID: "aff96734-e4bb-4642-9207-9293cce04a16") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.095013 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.095050 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.095060 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.095075 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.095084 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.144767 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" event={"ID":"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635","Type":"ContainerStarted","Data":"38aee41b2193e192d892c686e30d2a83e5a97ec448afd4bfa866bcabc4ae29ee"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.149503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.149611 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.149636 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.151827 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"0ebbcdbe31b67fe0c867a51f478ab484882d3d072062c6beec7545c2d3c2d625"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.156015 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c236a7a-7975-4f6e-8a56-07203c1833ef" containerID="2433c507ae5e9756a135a65359780134b36b9992383c8f33eac6c70818ec4615" exitCode=0 Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.156077 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" 
event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerDied","Data":"2433c507ae5e9756a135a65359780134b36b9992383c8f33eac6c70818ec4615"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.160423 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2mkdc" event={"ID":"17ec8ea6-c629-4177-ad4f-bca9f5e926ac","Type":"ContainerStarted","Data":"555a911dfcc6137ca755f625c025d9c79636b5b0c66f39f24417089f7e61737b"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.160489 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2mkdc" event={"ID":"17ec8ea6-c629-4177-ad4f-bca9f5e926ac","Type":"ContainerStarted","Data":"63f60719fe1be9a3d98208664b07d633c8f15fb3b631e9155b69ef45726a4cf8"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.166907 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wqbhz" podStartSLOduration=3.1668864660000002 podStartE2EDuration="3.166886466s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:04.764645952 +0000 UTC m=+93.443847780" watchObservedRunningTime="2025-11-26 13:32:05.166886466 +0000 UTC m=+93.846088284" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.201471 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.201511 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.201520 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.201535 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.201547 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.225673 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2mkdc" podStartSLOduration=3.22565463 podStartE2EDuration="3.22565463s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:05.220601748 +0000 UTC m=+93.899803576" watchObservedRunningTime="2025-11-26 13:32:05.22565463 +0000 UTC m=+93.904856438" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.309907 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.309953 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.309963 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.309978 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.309988 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.412170 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.412214 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.412225 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.412243 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.412254 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.514754 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.514805 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.514820 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.514846 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.514857 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.617027 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.617067 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.617077 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.617091 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.617100 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.719422 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.719458 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.719468 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.719481 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.719491 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.822475 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.822739 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.822749 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.822763 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.822773 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.925758 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.925795 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.925806 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.925828 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.925837 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:05Z","lastTransitionTime":"2025-11-26T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:05 crc kubenswrapper[5200]: I1126 13:32:05.968346 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:05 crc kubenswrapper[5200]: E1126 13:32:05.968494 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.027640 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.027675 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.027701 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.027715 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.027724 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.056495 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:06 crc kubenswrapper[5200]: E1126 13:32:06.056649 5200 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:06 crc kubenswrapper[5200]: E1126 13:32:06.056736 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs podName:aff96734-e4bb-4642-9207-9293cce04a16 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:08.056717891 +0000 UTC m=+96.735919709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs") pod "network-metrics-daemon-scmch" (UID: "aff96734-e4bb-4642-9207-9293cce04a16") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.130770 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.131286 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.131307 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.131327 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.131341 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.178809 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerStarted","Data":"1fb7e9db6c30c1e30098890c02fb4d3574af7b1ff343859b28c15d6e17a5dca2"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.187614 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" event={"ID":"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635","Type":"ContainerStarted","Data":"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.187735 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" event={"ID":"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635","Type":"ContainerStarted","Data":"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.196441 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.196864 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.197054 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.217121 5200 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" podStartSLOduration=4.217086308 podStartE2EDuration="4.217086308s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:06.21643428 +0000 UTC m=+94.895636108" watchObservedRunningTime="2025-11-26 13:32:06.217086308 +0000 UTC m=+94.896288136" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.233780 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.233840 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.233857 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.233882 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.233897 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.335995 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.336051 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.336064 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.336084 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.336098 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.438544 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.438588 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.438596 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.438611 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.438619 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.541726 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.541766 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.541775 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.541790 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.541800 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.644084 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.644138 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.644153 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.644174 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.644186 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.746575 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.746618 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.746627 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.746642 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.746650 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.848738 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.849100 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.849116 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.849134 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.849153 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.952129 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.952175 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.952184 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.952199 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.952210 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:06Z","lastTransitionTime":"2025-11-26T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.968631 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.968669 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:06 crc kubenswrapper[5200]: E1126 13:32:06.968780 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:06 crc kubenswrapper[5200]: E1126 13:32:06.968883 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:06 crc kubenswrapper[5200]: I1126 13:32:06.968965 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:06 crc kubenswrapper[5200]: E1126 13:32:06.969025 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.055233 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.055281 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.055293 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.055309 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.055320 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.157727 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.157774 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.157784 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.157798 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.157809 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.201749 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c236a7a-7975-4f6e-8a56-07203c1833ef" containerID="1fb7e9db6c30c1e30098890c02fb4d3574af7b1ff343859b28c15d6e17a5dca2" exitCode=0 Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.202182 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerDied","Data":"1fb7e9db6c30c1e30098890c02fb4d3574af7b1ff343859b28c15d6e17a5dca2"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.260944 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.261009 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.261424 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.261789 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.261801 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.371443 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.371511 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.371526 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.371554 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.371569 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.477314 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.477362 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.477381 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.477398 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.477409 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.579250 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.579305 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.579319 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.579338 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.579351 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.681268 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.681326 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.681339 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.681361 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.681375 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.784073 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.784142 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.784156 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.784178 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.784192 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.886555 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.887126 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.887141 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.887161 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.887174 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.969242 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:07 crc kubenswrapper[5200]: E1126 13:32:07.969400 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.993869 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.994129 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.994148 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.994201 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:07 crc kubenswrapper[5200]: I1126 13:32:07.994217 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:07Z","lastTransitionTime":"2025-11-26T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.078576 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.078816 5200 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.078906 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs podName:aff96734-e4bb-4642-9207-9293cce04a16 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:12.078888376 +0000 UTC m=+100.758090194 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs") pod "network-metrics-daemon-scmch" (UID: "aff96734-e4bb-4642-9207-9293cce04a16") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.097563 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.097615 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.097630 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.097654 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.097668 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.200667 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.200743 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.200755 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.200775 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.200790 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.208241 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c236a7a-7975-4f6e-8a56-07203c1833ef" containerID="250b8e4593ed092fe1270378718b49456e97d1f24f636082893142fca5580f48" exitCode=0 Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.208370 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerDied","Data":"250b8e4593ed092fe1270378718b49456e97d1f24f636082893142fca5580f48"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.212233 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.304632 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.304674 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.304701 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.304716 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.304728 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.406969 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.407028 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.407040 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.407062 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.407076 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.509454 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.509519 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.509533 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.509554 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.509567 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.611940 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.611981 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.611992 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.612005 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.612018 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.714048 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.714103 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.714119 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.714137 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.714151 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.817908 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.817955 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.817967 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.817983 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.817995 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.889170 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.889392 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:16.889360521 +0000 UTC m=+105.568562339 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.920921 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.920957 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.920967 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.920982 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.920993 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:08Z","lastTransitionTime":"2025-11-26T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.970168 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.970213 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.970239 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.970341 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.970412 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.970533 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.990374 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.990450 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.990511 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:08 crc kubenswrapper[5200]: I1126 13:32:08.990586 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990596 5200 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990712 5200 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990765 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990768 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990811 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:16.990671006 +0000 UTC m=+105.669872824 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990811 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990831 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:16.99082174 +0000 UTC m=+105.670023558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990843 5200 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990787 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990924 5200 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990904 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:16.990885022 +0000 UTC m=+105.670086850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:08 crc kubenswrapper[5200]: E1126 13:32:08.990979 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:16.990966314 +0000 UTC m=+105.670168132 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.023056 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.023108 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.023117 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.023137 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.023147 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.125556 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.125608 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.125618 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.125636 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.125650 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.219251 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c236a7a-7975-4f6e-8a56-07203c1833ef" containerID="676df2def554a43599a6071b6c4c4999921ec3a1742c07121dbf62f30e0fe67a" exitCode=0 Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.219375 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerDied","Data":"676df2def554a43599a6071b6c4c4999921ec3a1742c07121dbf62f30e0fe67a"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.236129 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.236178 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.236190 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.236211 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.236223 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.338870 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.338957 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.338982 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.339018 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.339040 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.442436 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.442491 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.442503 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.442521 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.442532 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.545594 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.545666 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.545714 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.545743 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.545765 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.648644 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.648732 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.648747 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.648769 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.648784 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.751774 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.751832 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.751843 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.751864 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.751876 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.854404 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.854490 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.854562 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.854587 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.854600 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.957515 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.957584 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.957599 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.957622 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.957636 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:09Z","lastTransitionTime":"2025-11-26T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:09 crc kubenswrapper[5200]: I1126 13:32:09.969363 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:09 crc kubenswrapper[5200]: E1126 13:32:09.969654 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.061126 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.061232 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.061261 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.061300 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.061333 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.164611 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.164852 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.164869 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.164888 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.164900 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.269613 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.270134 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.270152 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.270171 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.270184 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.374005 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.374060 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.374075 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.374094 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.374104 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.476958 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.477022 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.477038 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.477060 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.477074 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.580566 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.580644 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.580660 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.580708 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.580728 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.683634 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.683708 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.683723 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.683743 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.683756 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.786768 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.786835 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.786848 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.786874 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.786888 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.889734 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.889783 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.889794 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.889810 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.889820 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.969197 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.969197 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:10 crc kubenswrapper[5200]: E1126 13:32:10.969393 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:10 crc kubenswrapper[5200]: E1126 13:32:10.969437 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.969462 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:10 crc kubenswrapper[5200]: E1126 13:32:10.969516 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.992903 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.993563 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.993675 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.993794 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:10 crc kubenswrapper[5200]: I1126 13:32:10.993892 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:10Z","lastTransitionTime":"2025-11-26T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.052737 5200 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.096007 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.096083 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.096102 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.096135 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.096151 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:11Z","lastTransitionTime":"2025-11-26T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.200759 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.201190 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.201453 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.201706 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.201859 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:11Z","lastTransitionTime":"2025-11-26T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.249152 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerStarted","Data":"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.250500 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.250741 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.250911 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.276741 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerStarted","Data":"d3327bd5a90db6bc8c3da5d153a79e05ae322b0aab9f949654e19dba2bb2b1bd"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.290526 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.294299 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.298502 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podStartSLOduration=9.298488242 podStartE2EDuration="9.298488242s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:11.295831957 +0000 UTC m=+99.975033815" watchObservedRunningTime="2025-11-26 13:32:11.298488242 +0000 UTC m=+99.977690070" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.305635 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.305835 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.305926 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.305966 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.305986 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:11Z","lastTransitionTime":"2025-11-26T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.409667 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.409768 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.409789 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.409814 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.409831 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:11Z","lastTransitionTime":"2025-11-26T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.462672 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.463217 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.463232 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.463250 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.463262 5200 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T13:32:11Z","lastTransitionTime":"2025-11-26T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.522108 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9"] Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.536319 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.539509 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.539654 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\"" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.539803 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.539975 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\"" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.624888 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e061399a-a050-4f2e-bad0-7174a3b8a060-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.624957 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e061399a-a050-4f2e-bad0-7174a3b8a060-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.625107 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e061399a-a050-4f2e-bad0-7174a3b8a060-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.625156 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e061399a-a050-4f2e-bad0-7174a3b8a060-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.625186 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e061399a-a050-4f2e-bad0-7174a3b8a060-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 
13:32:11.725773 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e061399a-a050-4f2e-bad0-7174a3b8a060-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.725836 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e061399a-a050-4f2e-bad0-7174a3b8a060-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.725976 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e061399a-a050-4f2e-bad0-7174a3b8a060-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.726058 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e061399a-a050-4f2e-bad0-7174a3b8a060-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.726107 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e061399a-a050-4f2e-bad0-7174a3b8a060-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.726134 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e061399a-a050-4f2e-bad0-7174a3b8a060-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.726224 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e061399a-a050-4f2e-bad0-7174a3b8a060-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.727234 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e061399a-a050-4f2e-bad0-7174a3b8a060-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.738805 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e061399a-a050-4f2e-bad0-7174a3b8a060-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.745094 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e061399a-a050-4f2e-bad0-7174a3b8a060-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-vsms9\" (UID: \"e061399a-a050-4f2e-bad0-7174a3b8a060\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.858667 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" Nov 26 13:32:11 crc kubenswrapper[5200]: W1126 13:32:11.883744 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode061399a_a050_4f2e_bad0_7174a3b8a060.slice/crio-c0c7aab68703319f4138629e984728f9229287b09aa73ab86c21f2c5188e909b WatchSource:0}: Error finding container c0c7aab68703319f4138629e984728f9229287b09aa73ab86c21f2c5188e909b: Status 404 returned error can't find the container with id c0c7aab68703319f4138629e984728f9229287b09aa73ab86c21f2c5188e909b Nov 26 13:32:11 crc kubenswrapper[5200]: I1126 13:32:11.969251 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:11 crc kubenswrapper[5200]: E1126 13:32:11.969421 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.131554 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:12 crc kubenswrapper[5200]: E1126 13:32:12.131769 5200 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:12 crc kubenswrapper[5200]: E1126 13:32:12.131863 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs podName:aff96734-e4bb-4642-9207-9293cce04a16 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:20.131838627 +0000 UTC m=+108.811040435 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs") pod "network-metrics-daemon-scmch" (UID: "aff96734-e4bb-4642-9207-9293cce04a16") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.284538 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c236a7a-7975-4f6e-8a56-07203c1833ef" containerID="d3327bd5a90db6bc8c3da5d153a79e05ae322b0aab9f949654e19dba2bb2b1bd" exitCode=0 Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.284663 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerDied","Data":"d3327bd5a90db6bc8c3da5d153a79e05ae322b0aab9f949654e19dba2bb2b1bd"} Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.287217 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" event={"ID":"e061399a-a050-4f2e-bad0-7174a3b8a060","Type":"ContainerStarted","Data":"3ec44c0796bdad36954eda0df18487580ebadbd2dda219b757cda6f9abd2003e"} Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.287309 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" event={"ID":"e061399a-a050-4f2e-bad0-7174a3b8a060","Type":"ContainerStarted","Data":"c0c7aab68703319f4138629e984728f9229287b09aa73ab86c21f2c5188e909b"} Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.976418 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.976455 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:12 crc kubenswrapper[5200]: I1126 13:32:12.976590 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:12 crc kubenswrapper[5200]: E1126 13:32:12.976580 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:12 crc kubenswrapper[5200]: E1126 13:32:12.976873 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:12 crc kubenswrapper[5200]: E1126 13:32:12.976786 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:13 crc kubenswrapper[5200]: I1126 13:32:13.235776 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-vsms9" podStartSLOduration=11.235758552 podStartE2EDuration="11.235758552s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:12.341193344 +0000 UTC m=+101.020395182" watchObservedRunningTime="2025-11-26 13:32:13.235758552 +0000 UTC m=+101.914960370" Nov 26 13:32:13 crc kubenswrapper[5200]: I1126 13:32:13.236362 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-scmch"] Nov 26 13:32:13 crc kubenswrapper[5200]: I1126 13:32:13.236525 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:13 crc kubenswrapper[5200]: E1126 13:32:13.236634 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:13 crc kubenswrapper[5200]: I1126 13:32:13.295671 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c236a7a-7975-4f6e-8a56-07203c1833ef" containerID="cf570739a91f3d11c80e06d379e8d65b862187569c3b71d6dec41c1ba37e083f" exitCode=0 Nov 26 13:32:13 crc kubenswrapper[5200]: I1126 13:32:13.297195 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerDied","Data":"cf570739a91f3d11c80e06d379e8d65b862187569c3b71d6dec41c1ba37e083f"} Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.143929 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.307865 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" event={"ID":"1c236a7a-7975-4f6e-8a56-07203c1833ef","Type":"ContainerStarted","Data":"2a4c807e809cd9e70adaabf2b121fc70b7d5995f63b75e5d8a5dbd0b97e3a662"} Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.333560 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4bsk9" podStartSLOduration=12.333533025 podStartE2EDuration="12.333533025s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:14.333084093 +0000 UTC m=+103.012285921" watchObservedRunningTime="2025-11-26 13:32:14.333533025 +0000 UTC m=+103.012734843" Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.975012 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.975023 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:14 crc kubenswrapper[5200]: E1126 13:32:14.975202 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.975269 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:14 crc kubenswrapper[5200]: I1126 13:32:14.975512 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:14 crc kubenswrapper[5200]: E1126 13:32:14.975484 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:14 crc kubenswrapper[5200]: E1126 13:32:14.975635 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:14 crc kubenswrapper[5200]: E1126 13:32:14.975784 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:16 crc kubenswrapper[5200]: I1126 13:32:16.910132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:16 crc kubenswrapper[5200]: E1126 13:32:16.910444 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.910394969 +0000 UTC m=+121.589596827 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:16 crc kubenswrapper[5200]: I1126 13:32:16.968510 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:16 crc kubenswrapper[5200]: I1126 13:32:16.968544 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:16 crc kubenswrapper[5200]: E1126 13:32:16.969228 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-scmch" podUID="aff96734-e4bb-4642-9207-9293cce04a16" Nov 26 13:32:16 crc kubenswrapper[5200]: I1126 13:32:16.968660 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:16 crc kubenswrapper[5200]: E1126 13:32:16.969347 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Nov 26 13:32:16 crc kubenswrapper[5200]: I1126 13:32:16.968700 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:16 crc kubenswrapper[5200]: E1126 13:32:16.969454 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Nov 26 13:32:16 crc kubenswrapper[5200]: E1126 13:32:16.969644 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.011929 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.011995 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.012036 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012119 5200 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012189 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.012170657 +0000 UTC m=+121.691372465 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012259 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012308 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012325 5200 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012424 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. 
No retries permitted until 2025-11-26 13:32:33.012396053 +0000 UTC m=+121.691598041 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.012479 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012618 5200 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012653 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.01264455 +0000 UTC m=+121.691846368 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012708 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012738 5200 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012749 5200 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:17 crc kubenswrapper[5200]: E1126 13:32:17.012786 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.012775944 +0000 UTC m=+121.691977922 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.490645 5200 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeReady" Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.491330 5200 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Nov 26 13:32:17 crc kubenswrapper[5200]: I1126 13:32:17.542750 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-67c89758df-ksjj6"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.395773 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.397085 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.401264 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.403632 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.403655 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.403718 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.403844 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.420117 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.430029 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e435d47f-45c4-4a22-8748-86166a6737c1-config\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.430221 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d46b\" (UniqueName: \"kubernetes.io/projected/e435d47f-45c4-4a22-8748-86166a6737c1-kube-api-access-7d46b\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.430300 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e435d47f-45c4-4a22-8748-86166a6737c1-serving-cert\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.430348 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e435d47f-45c4-4a22-8748-86166a6737c1-trusted-ca\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.483510 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.483852 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.525621 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.525796 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.526146 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.526233 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.526155 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.526534 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.526855 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.527133 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.527365 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531424 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d46b\" (UniqueName: \"kubernetes.io/projected/e435d47f-45c4-4a22-8748-86166a6737c1-kube-api-access-7d46b\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531472 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-audit-policies\") pod \"apiserver-8596bd845d-bjpbs\" 
(UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531529 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e435d47f-45c4-4a22-8748-86166a6737c1-serving-cert\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531563 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e435d47f-45c4-4a22-8748-86166a6737c1-config\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531635 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-etcd-client\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531661 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-etcd-serving-ca\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531691 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531756 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-audit-dir\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531781 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv69h\" (UniqueName: \"kubernetes.io/projected/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-kube-api-access-cv69h\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531807 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e435d47f-45c4-4a22-8748-86166a6737c1-trusted-ca\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531833 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-encryption-config\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.531867 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-serving-cert\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.535184 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e435d47f-45c4-4a22-8748-86166a6737c1-config\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.535846 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e435d47f-45c4-4a22-8748-86166a6737c1-trusted-ca\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.544139 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e435d47f-45c4-4a22-8748-86166a6737c1-serving-cert\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.565131 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d46b\" (UniqueName: \"kubernetes.io/projected/e435d47f-45c4-4a22-8748-86166a6737c1-kube-api-access-7d46b\") pod \"console-operator-67c89758df-ksjj6\" (UID: \"e435d47f-45c4-4a22-8748-86166a6737c1\") " pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.567953 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.568372 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.568717 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.569103 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.569367 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.569549 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.571344 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.571797 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.572668 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.573025 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.573402 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.573541 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.574766 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.575044 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.575293 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.576362 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.576679 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.576786 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.582421 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632658 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dl4\" (UniqueName: \"kubernetes.io/projected/6d7b6fb5-ca34-4fa3-a509-63527956edec-kube-api-access-d2dl4\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632743 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7b6fb5-ca34-4fa3-a509-63527956edec-serving-cert\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " 
pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632844 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-etcd-client\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632863 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-etcd-serving-ca\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632882 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632900 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632934 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-audit-dir\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632952 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv69h\" (UniqueName: \"kubernetes.io/projected/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-kube-api-access-cv69h\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.632973 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-client-ca\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.633010 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-encryption-config\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.633029 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6d7b6fb5-ca34-4fa3-a509-63527956edec-tmp\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.633055 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-serving-cert\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.633092 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-config\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.633125 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-audit-policies\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.633993 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-audit-policies\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.634388 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-audit-dir\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.634960 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-etcd-serving-ca\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.635013 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.641338 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-serving-cert\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.641427 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-encryption-config\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.642161 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-etcd-client\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.655585 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv69h\" (UniqueName: \"kubernetes.io/projected/6a3a94bc-1154-4d4f-b1f2-061c4b252e8f-kube-api-access-cv69h\") pod \"apiserver-8596bd845d-bjpbs\" (UID: \"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.734402 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.734932 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dl4\" (UniqueName: \"kubernetes.io/projected/6d7b6fb5-ca34-4fa3-a509-63527956edec-kube-api-access-d2dl4\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.735016 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7b6fb5-ca34-4fa3-a509-63527956edec-serving-cert\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.735080 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.735121 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-client-ca\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.735878 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d7b6fb5-ca34-4fa3-a509-63527956edec-tmp\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.735961 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-config\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.736547 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d7b6fb5-ca34-4fa3-a509-63527956edec-tmp\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.737254 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.738221 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-config\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.738532 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-client-ca\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.746658 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7b6fb5-ca34-4fa3-a509-63527956edec-serving-cert\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.757065 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dl4\" (UniqueName: \"kubernetes.io/projected/6d7b6fb5-ca34-4fa3-a509-63527956edec-kube-api-access-d2dl4\") pod \"controller-manager-65b6cccf98-h6ns5\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.770067 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d44f6ddf-vvkzj"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.770245 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.776627 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.776666 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.779127 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.780859 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.780972 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.782756 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.784711 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.837752 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qrh\" (UniqueName: \"kubernetes.io/projected/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-kube-api-access-r5qrh\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.837918 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.837971 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-serving-cert\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.838127 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-config\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.838234 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.841548 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-54qvp"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.841814 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.845568 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.847136 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.847429 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.851272 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.851281 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.851357 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.851824 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.851979 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.858263 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.860717 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.892652 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-nmkr7"] Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.892928 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.896468 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.896781 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.898823 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.898962 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.899403 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.899952 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.901811 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.902463 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.902560 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.903374 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.915445 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\"" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.928060 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939498 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qrh\" (UniqueName: \"kubernetes.io/projected/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-kube-api-access-r5qrh\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939594 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-service-ca\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939626 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-etcd-client\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939662 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939691 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939734 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939764 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6346413-47ab-40fc-bf6b-9315694ebaaa-audit-dir\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939812 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-serving-cert\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939835 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-encryption-config\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939874 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-serving-cert\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939917 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-image-import-ca\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939974 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-serving-cert\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.939998 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-config\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.940028 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-config\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.940647 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhvvt\" (UniqueName: \"kubernetes.io/projected/a6346413-47ab-40fc-bf6b-9315694ebaaa-kube-api-access-vhvvt\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.940749 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-config\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.940919 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-oauth-config\") pod 
\"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.940992 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh4f5\" (UniqueName: \"kubernetes.io/projected/5e4ba348-184c-436b-87ae-2fddc33c378e-kube-api-access-bh4f5\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941051 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941081 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-console-config\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941110 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-oauth-serving-cert\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941137 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6346413-47ab-40fc-bf6b-9315694ebaaa-node-pullsecrets\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941193 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-audit\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941220 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-trusted-ca-bundle\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.941832 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.942411 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.945114 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-serving-cert\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:18 crc kubenswrapper[5200]: I1126 13:32:18.961431 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qrh\" (UniqueName: \"kubernetes.io/projected/971db65f-e6c1-4d92-bbff-61ae1e6ec5e1-kube-api-access-r5qrh\") pod \"authentication-operator-7f5c659b84-5cj6f\" (UID: \"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.039101 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-54c688565-jr48p"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.039359 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042160 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-serving-cert\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042203 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-config\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042234 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhvvt\" (UniqueName: \"kubernetes.io/projected/a6346413-47ab-40fc-bf6b-9315694ebaaa-kube-api-access-vhvvt\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042261 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-oauth-config\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042285 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh4f5\" (UniqueName: \"kubernetes.io/projected/5e4ba348-184c-436b-87ae-2fddc33c378e-kube-api-access-bh4f5\") pod \"console-64d44f6ddf-vvkzj\" (UID: 
\"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042313 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-console-config\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042337 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-oauth-serving-cert\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042361 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6346413-47ab-40fc-bf6b-9315694ebaaa-node-pullsecrets\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042392 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-audit\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042414 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-trusted-ca-bundle\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042480 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-service-ca\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042500 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-etcd-client\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042530 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042551 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " 
pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042576 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6346413-47ab-40fc-bf6b-9315694ebaaa-audit-dir\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042600 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-serving-cert\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042627 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-encryption-config\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.042660 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-image-import-ca\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.044158 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-image-import-ca\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.044320 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6346413-47ab-40fc-bf6b-9315694ebaaa-audit-dir\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.045357 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-trusted-ca-bundle\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.045373 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-config\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.046337 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-console-config\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: 
I1126 13:32:19.046959 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-oauth-serving-cert\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.047033 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6346413-47ab-40fc-bf6b-9315694ebaaa-node-pullsecrets\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.047222 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.047610 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.047903 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.048504 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.048518 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-serving-cert\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.048803 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.048813 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.049292 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.049017 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.049131 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.049445 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Nov 26 13:32:19 crc 
kubenswrapper[5200]: I1126 13:32:19.049246 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.049167 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.050286 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-serving-cert\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.050361 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-service-ca\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.050555 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.051751 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.052325 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-etcd-client\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.052910 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.053030 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a6346413-47ab-40fc-bf6b-9315694ebaaa-encryption-config\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.058720 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-oauth-config\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.060936 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.065142 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a6346413-47ab-40fc-bf6b-9315694ebaaa-audit\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 
13:32:19.066731 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.071457 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh4f5\" (UniqueName: \"kubernetes.io/projected/5e4ba348-184c-436b-87ae-2fddc33c378e-kube-api-access-bh4f5\") pod \"console-64d44f6ddf-vvkzj\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.075726 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhvvt\" (UniqueName: \"kubernetes.io/projected/a6346413-47ab-40fc-bf6b-9315694ebaaa-kube-api-access-vhvvt\") pod \"apiserver-9ddfb9f55-54qvp\" (UID: \"a6346413-47ab-40fc-bf6b-9315694ebaaa\") " pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.099425 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.143421 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144072 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-policies\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144136 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144187 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144226 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144280 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144406 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144449 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-dir\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144488 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144742 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144819 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144845 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rpj\" (UniqueName: \"kubernetes.io/projected/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-kube-api-access-r8rpj\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144892 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " 
pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.144976 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.168925 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.223363 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246092 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246157 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246183 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-dir\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246214 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246249 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246271 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" 
Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.246292 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8rpj\" (UniqueName: \"kubernetes.io/projected/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-kube-api-access-r8rpj\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.247159 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.247788 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.249007 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.249101 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.249209 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-dir\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.250019 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.250115 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-policies\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.250187 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.250284 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.250327 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.250969 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.251196 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-policies\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.255660 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.256855 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.256940 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.260463 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.260985 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.261355 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.262801 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.269422 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.274329 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8rpj\" (UniqueName: \"kubernetes.io/projected/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-kube-api-access-r8rpj\") pod \"oauth-openshift-66458b6674-nmkr7\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.329162 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.329450 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.337578 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.337578 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.337905 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.337778 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.337990 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.337770 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.351432 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0ff30ccc-02d7-4687-921d-9f43931df261-machine-approver-tls\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.351595 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ff30ccc-02d7-4687-921d-9f43931df261-auth-proxy-config\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.351657 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff30ccc-02d7-4687-921d-9f43931df261-config\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.353631 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89t7k\" (UniqueName: \"kubernetes.io/projected/0ff30ccc-02d7-4687-921d-9f43931df261-kube-api-access-89t7k\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: W1126 13:32:19.363977 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4ba348_184c_436b_87ae_2fddc33c378e.slice/crio-58e07c8d22d8dc4a31c93d564458f59a50385cb0bf30fd83b401a15774d5b434 WatchSource:0}: Error finding container 58e07c8d22d8dc4a31c93d564458f59a50385cb0bf30fd83b401a15774d5b434: 
Status 404 returned error can't find the container with id 58e07c8d22d8dc4a31c93d564458f59a50385cb0bf30fd83b401a15774d5b434 Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.371627 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:19 crc kubenswrapper[5200]: W1126 13:32:19.422972 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6346413_47ab_40fc_bf6b_9315694ebaaa.slice/crio-89252443b9365b479cf7b27fa4b46ea3a7a1e73c8e524d54db1889ed486983f5 WatchSource:0}: Error finding container 89252443b9365b479cf7b27fa4b46ea3a7a1e73c8e524d54db1889ed486983f5: Status 404 returned error can't find the container with id 89252443b9365b479cf7b27fa4b46ea3a7a1e73c8e524d54db1889ed486983f5 Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.455068 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0ff30ccc-02d7-4687-921d-9f43931df261-machine-approver-tls\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.455131 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ff30ccc-02d7-4687-921d-9f43931df261-auth-proxy-config\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.455164 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff30ccc-02d7-4687-921d-9f43931df261-config\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.455221 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89t7k\" (UniqueName: \"kubernetes.io/projected/0ff30ccc-02d7-4687-921d-9f43931df261-kube-api-access-89t7k\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.456690 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff30ccc-02d7-4687-921d-9f43931df261-config\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.457461 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ff30ccc-02d7-4687-921d-9f43931df261-auth-proxy-config\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.462967 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" 
(UniqueName: \"kubernetes.io/secret/0ff30ccc-02d7-4687-921d-9f43931df261-machine-approver-tls\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.463340 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.464276 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.467909 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.468520 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.468737 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.469123 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.477739 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-g99bm"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.478003 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.483229 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.483480 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.483678 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.484124 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.484375 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.487486 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89t7k\" (UniqueName: \"kubernetes.io/projected/0ff30ccc-02d7-4687-921d-9f43931df261-kube-api-access-89t7k\") pod \"machine-approver-54c688565-jr48p\" (UID: \"0ff30ccc-02d7-4687-921d-9f43931df261\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.497641 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.501178 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.501350 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.501890 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.502171 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.502344 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.510437 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.518312 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" event={"ID":"e435d47f-45c4-4a22-8748-86166a6737c1","Type":"ContainerStarted","Data":"99d362d25644b9d75dacd3d3040c712a96c0f0402ef9726416802a2b163af3bc"} Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.519159 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.537757 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" event={"ID":"6d7b6fb5-ca34-4fa3-a509-63527956edec","Type":"ContainerStarted","Data":"5aacf521c649b8ca52a3834ee6790db199277207eeb15945a691a089b585de16"} Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.537824 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-krpxs"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.538387 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.541869 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.542565 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.542904 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.546393 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.546784 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.549738 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.549830 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-747b44746d-5sjbk"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.551623 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.554158 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.554355 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.555179 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.555426 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556168 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7363e7ee-68c6-4682-acf4-95c6ab68965a-config\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556215 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-config\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556243 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9gb\" (UniqueName: \"kubernetes.io/projected/a5a4136e-159f-4279-9632-b62ccf9824e1-kube-api-access-4x9gb\") pod \"cluster-samples-operator-6b564684c8-q2qxn\" (UID: \"a5a4136e-159f-4279-9632-b62ccf9824e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkh9k\" (UniqueName: \"kubernetes.io/projected/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-kube-api-access-fkh9k\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556303 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-images\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556375 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5a4136e-159f-4279-9632-b62ccf9824e1-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-q2qxn\" (UID: \"a5a4136e-159f-4279-9632-b62ccf9824e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556416 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7363e7ee-68c6-4682-acf4-95c6ab68965a-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556453 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82a64db7-bf3c-410b-9772-e830bfa83118-serving-cert\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556478 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzx5\" (UniqueName: \"kubernetes.io/projected/82a64db7-bf3c-410b-9772-e830bfa83118-kube-api-access-kzzx5\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556520 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-config\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc 
kubenswrapper[5200]: I1126 13:32:19.556544 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556575 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82a64db7-bf3c-410b-9772-e830bfa83118-tmp\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556611 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlbf\" (UniqueName: \"kubernetes.io/projected/7363e7ee-68c6-4682-acf4-95c6ab68965a-kube-api-access-vqlbf\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.556649 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-client-ca\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.571340 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.571617 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.573736 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.580155 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.580448 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.587259 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.587534 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.587880 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.588218 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.588921 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.601941 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.602244 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.608762 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.609081 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.610015 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.616355 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.616473 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-k5pvc"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.621381 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.621806 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.622316 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.622547 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.622903 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.623046 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.623207 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.626763 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.627220 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.627390 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.629125 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.630405 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.633098 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.634572 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.635336 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.635958 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" event={"ID":"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f","Type":"ContainerStarted","Data":"e7c96b9b451edc94feec02432f6f66687f4f17ef92d2964a11d9f393e30e80b3"} Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.635989 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-g7wb6"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.637290 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.643273 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.643878 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.644296 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.656012 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68cf44c8b8-cpmj7"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.657135 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.657263 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659766 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659787 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82a64db7-bf3c-410b-9772-e830bfa83118-tmp\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659828 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58h9x\" (UniqueName: \"kubernetes.io/projected/b60d512c-29b7-4ac4-b63d-59a43f736b05-kube-api-access-58h9x\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659866 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlbf\" (UniqueName: \"kubernetes.io/projected/7363e7ee-68c6-4682-acf4-95c6ab68965a-kube-api-access-vqlbf\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659897 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvcx\" (UniqueName: \"kubernetes.io/projected/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-kube-api-access-tfvcx\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659921 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-available-featuregates\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659958 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-client-ca\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.659999 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7363e7ee-68c6-4682-acf4-95c6ab68965a-config\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660021 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-config\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " 
pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660041 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9gb\" (UniqueName: \"kubernetes.io/projected/a5a4136e-159f-4279-9632-b62ccf9824e1-kube-api-access-4x9gb\") pod \"cluster-samples-operator-6b564684c8-q2qxn\" (UID: \"a5a4136e-159f-4279-9632-b62ccf9824e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660059 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60d512c-29b7-4ac4-b63d-59a43f736b05-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660078 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b60d512c-29b7-4ac4-b63d-59a43f736b05-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660095 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5btfm\" (UniqueName: \"kubernetes.io/projected/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-kube-api-access-5btfm\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660122 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-tmp-dir\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660153 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkh9k\" (UniqueName: \"kubernetes.io/projected/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-kube-api-access-fkh9k\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660174 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-images\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660198 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5a4136e-159f-4279-9632-b62ccf9824e1-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-q2qxn\" (UID: \"a5a4136e-159f-4279-9632-b62ccf9824e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 
26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660227 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7363e7ee-68c6-4682-acf4-95c6ab68965a-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660253 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-serving-cert\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660274 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82a64db7-bf3c-410b-9772-e830bfa83118-serving-cert\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660298 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzx5\" (UniqueName: \"kubernetes.io/projected/82a64db7-bf3c-410b-9772-e830bfa83118-kube-api-access-kzzx5\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660347 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-config\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660370 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660401 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b60d512c-29b7-4ac4-b63d-59a43f736b05-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660433 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-metrics-tls\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.660893 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82a64db7-bf3c-410b-9772-e830bfa83118-tmp\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.661971 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7363e7ee-68c6-4682-acf4-95c6ab68965a-config\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.662399 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-client-ca\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.664008 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-config\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.673781 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7363e7ee-68c6-4682-acf4-95c6ab68965a-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.675073 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82a64db7-bf3c-410b-9772-e830bfa83118-serving-cert\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.676462 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.678311 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5a4136e-159f-4279-9632-b62ccf9824e1-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-q2qxn\" (UID: \"a5a4136e-159f-4279-9632-b62ccf9824e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.678353 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 
13:32:19.678993 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.679584 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.680443 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-images\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.680866 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-config\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.688038 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.688734 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.699937 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-ksjj6"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.699976 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.699993 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.700111 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.712334 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-nmkr7"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.712417 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.712437 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.713791 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.718912 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.736437 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.736830 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.737272 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.747913 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.747987 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.757477 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.764559 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rnv6n"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769412 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-metrics-tls\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769553 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58h9x\" (UniqueName: \"kubernetes.io/projected/b60d512c-29b7-4ac4-b63d-59a43f736b05-kube-api-access-58h9x\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769586 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvcx\" (UniqueName: \"kubernetes.io/projected/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-kube-api-access-tfvcx\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769627 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-available-featuregates\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769705 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60d512c-29b7-4ac4-b63d-59a43f736b05-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769921 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b60d512c-29b7-4ac4-b63d-59a43f736b05-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.769995 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5btfm\" (UniqueName: \"kubernetes.io/projected/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-kube-api-access-5btfm\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.770064 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-tmp-dir\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.770212 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-serving-cert\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.770344 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b60d512c-29b7-4ac4-b63d-59a43f736b05-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.770536 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-available-featuregates\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.770927 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-tmp-dir\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.775913 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.776260 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.776857 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.780633 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.783400 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-serving-cert\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.799015 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60d512c-29b7-4ac4-b63d-59a43f736b05-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.805515 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.806061 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.806990 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.814962 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b60d512c-29b7-4ac4-b63d-59a43f736b05-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.818390 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.823934 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.824179 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.837187 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.846074 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-lmbx9"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.847031 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.857421 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.865244 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-metrics-tls\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.868006 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.868219 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.874722 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-gj489"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.874945 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.876608 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.881782 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-74545575db-76hbn"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.882749 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.899917 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.900151 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.914701 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.915089 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.916922 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.925924 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lpc6x"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.926191 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.938675 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.939905 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ctstk"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.940221 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.957221 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976233 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976306 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-krpxs"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976321 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976338 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976352 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976366 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-54qvp"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976378 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976390 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-g99bm"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976403 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-5sjbk"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976415 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976427 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976484 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-vvkzj"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976499 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976510 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.976523 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-twnv2"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.977656 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.996109 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\"" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997851 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997897 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997912 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997927 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997941 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997956 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ctstk"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.997971 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-gj489"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998018 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998033 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998045 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-k5pvc"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998059 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-g7wb6"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998074 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998090 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ksh5l"] Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998227 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:19 crc kubenswrapper[5200]: I1126 13:32:19.998837 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014014 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014063 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014078 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014095 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ksh5l"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014106 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014118 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lpc6x"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014129 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-76hbn"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014141 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-lmbx9"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014152 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014250 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014394 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014414 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-ksjj6"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014428 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014440 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014454 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-vvkzj"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014469 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-54qvp"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.014513 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-nmkr7"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.017140 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.037439 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.057336 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.094406 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlbf\" (UniqueName: \"kubernetes.io/projected/7363e7ee-68c6-4682-acf4-95c6ab68965a-kube-api-access-vqlbf\") pod \"openshift-apiserver-operator-846cbfc458-kxjqs\" (UID: \"7363e7ee-68c6-4682-acf4-95c6ab68965a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.115206 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzx5\" (UniqueName: \"kubernetes.io/projected/82a64db7-bf3c-410b-9772-e830bfa83118-kube-api-access-kzzx5\") pod \"route-controller-manager-776cdc94d6-685h9\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.135060 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkh9k\" (UniqueName: \"kubernetes.io/projected/4dc74e00-faf0-4e1f-9b82-4a03c34a8928-kube-api-access-fkh9k\") pod \"machine-api-operator-755bb95488-g99bm\" (UID: \"4dc74e00-faf0-4e1f-9b82-4a03c34a8928\") " pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.139764 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.156718 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.157598 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9gb\" (UniqueName: \"kubernetes.io/projected/a5a4136e-159f-4279-9632-b62ccf9824e1-kube-api-access-4x9gb\") pod \"cluster-samples-operator-6b564684c8-q2qxn\" (UID: \"a5a4136e-159f-4279-9632-b62ccf9824e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.177998 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.178407 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.186334 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aff96734-e4bb-4642-9207-9293cce04a16-metrics-certs\") pod \"network-metrics-daemon-scmch\" (UID: \"aff96734-e4bb-4642-9207-9293cce04a16\") " pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.197662 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.218090 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.236548 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.257333 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.275171 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.276873 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.298421 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.317349 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.317635 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.337667 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.345491 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.362287 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.373906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" event={"ID":"0ff30ccc-02d7-4687-921d-9f43931df261","Type":"ContainerStarted","Data":"d5587e11ae490374945fc25f6a514575e4d39cc6a18757f7e9c463ad188d7ec3"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.373977 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" event={"ID":"0ff30ccc-02d7-4687-921d-9f43931df261","Type":"ContainerStarted","Data":"137cf91d463fdffaf5de810301a3850b1b8546f048a297f304a33696b354a348"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.375010 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" event={"ID":"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e","Type":"ContainerStarted","Data":"03d06edce6a485eae1ef49e15d55331d857733adf09ef67bf19d1a4b681ec607"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.375039 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" event={"ID":"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e","Type":"ContainerStarted","Data":"8c121d22ba660cc2a9582708fa612b4d32d21312fa5b784af1ac470ca8897f02"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.375353 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.376235 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" event={"ID":"e435d47f-45c4-4a22-8748-86166a6737c1","Type":"ContainerStarted","Data":"be08399812a79b8f1614887f26b670e2cd2d022c799cd7068966d605f0b63cb9"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.377559 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.378264 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-vvkzj" event={"ID":"5e4ba348-184c-436b-87ae-2fddc33c378e","Type":"ContainerStarted","Data":"2e649cf23852b660db8e678b7652ebc64a43dad6999b8449fb5aad0703e2bbe5"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.378294 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-vvkzj" 
event={"ID":"5e4ba348-184c-436b-87ae-2fddc33c378e","Type":"ContainerStarted","Data":"58e07c8d22d8dc4a31c93d564458f59a50385cb0bf30fd83b401a15774d5b434"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.379588 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.384892 5200 patch_prober.go:28] interesting pod/console-operator-67c89758df-ksjj6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.384967 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" podUID="e435d47f-45c4-4a22-8748-86166a6737c1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.385013 5200 patch_prober.go:28] interesting pod/oauth-openshift-66458b6674-nmkr7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.385174 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" podUID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.388861 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" event={"ID":"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1","Type":"ContainerStarted","Data":"cd6c5d954dadfca9126b812fd016638e742e3c2f4499ae16f6e89fa0b9883dbd"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.388906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" event={"ID":"971db65f-e6c1-4d92-bbff-61ae1e6ec5e1","Type":"ContainerStarted","Data":"9c2dfe3ed58b8d7c54dab3009c61b7597193fc59bd61ef037692532170e0a83d"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.390743 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" event={"ID":"6d7b6fb5-ca34-4fa3-a509-63527956edec","Type":"ContainerStarted","Data":"0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.392630 5200 generic.go:358] "Generic (PLEG): container finished" podID="6a3a94bc-1154-4d4f-b1f2-061c4b252e8f" containerID="99a1607217445e51e923def741673b5a50194d7c7a2e97a579b5b9261cdf8498" exitCode=0 Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.392684 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" event={"ID":"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f","Type":"ContainerDied","Data":"99a1607217445e51e923def741673b5a50194d7c7a2e97a579b5b9261cdf8498"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.395325 5200 
generic.go:358] "Generic (PLEG): container finished" podID="a6346413-47ab-40fc-bf6b-9315694ebaaa" containerID="04d97eb530c928213b99129534aefbf5986c70b1c045e4ee37cd3b68834312d0" exitCode=0 Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.395380 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" event={"ID":"a6346413-47ab-40fc-bf6b-9315694ebaaa","Type":"ContainerDied","Data":"04d97eb530c928213b99129534aefbf5986c70b1c045e4ee37cd3b68834312d0"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.395396 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" event={"ID":"a6346413-47ab-40fc-bf6b-9315694ebaaa","Type":"ContainerStarted","Data":"89252443b9365b479cf7b27fa4b46ea3a7a1e73c8e524d54db1889ed486983f5"} Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.398617 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.403804 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.424298 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-scmch" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.424900 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.435949 5200 patch_prober.go:28] interesting pod/controller-manager-65b6cccf98-h6ns5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.436026 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" podUID="6d7b6fb5-ca34-4fa3-a509-63527956edec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.445252 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.461825 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.477740 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.497319 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.516495 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\"" Nov 26 13:32:20 crc kubenswrapper[5200]: 
I1126 13:32:20.524626 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-g99bm"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.564082 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.565676 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.578250 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.592971 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.637063 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58h9x\" (UniqueName: \"kubernetes.io/projected/b60d512c-29b7-4ac4-b63d-59a43f736b05-kube-api-access-58h9x\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.649270 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvcx\" (UniqueName: \"kubernetes.io/projected/6d2ce281-016f-4cb6-b51f-5fbdcd007f99-kube-api-access-tfvcx\") pod \"dns-operator-799b87ffcd-g7wb6\" (UID: \"6d2ce281-016f-4cb6-b51f-5fbdcd007f99\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.670346 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5btfm\" (UniqueName: \"kubernetes.io/projected/260a7f9c-0a16-4a9c-95f9-84ddef0bbf96-kube-api-access-5btfm\") pod \"openshift-config-operator-5777786469-krpxs\" (UID: \"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96\") " pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.677129 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.677408 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b60d512c-29b7-4ac4-b63d-59a43f736b05-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qmpmk\" (UID: \"b60d512c-29b7-4ac4-b63d-59a43f736b05\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.697487 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.717756 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.740807 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.742014 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.743400 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.757263 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-sysctl-allowlist\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.781668 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.797242 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.802460 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.820632 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.829266 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-scmch"] Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.834873 5200 request.go:752] "Waited before sending request" delay="1.010234784s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcollect-profiles-dockercfg-vfqp6&limit=500&resourceVersion=0" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.837058 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.856738 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.877273 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.898718 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.920392 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.935946 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.938460 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.958442 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.977300 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\"" Nov 26 13:32:20 crc kubenswrapper[5200]: I1126 13:32:20.999415 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.022626 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.024049 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-g7wb6"] Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.046097 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.077615 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.083943 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk"] Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.096859 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101258 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16667404-8d60-4bed-bd9e-e8e14dae10a0-config\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101466 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-tls\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101511 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/facd7487-680a-45aa-b205-ea98d33f5ac6-config\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101540 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-certificates\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101571 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-ca-trust-extracted\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101719 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.101854 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16667404-8d60-4bed-bd9e-e8e14dae10a0-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103031 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103077 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c5c2833-7cd0-4393-8af3-38aecce665c5-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103117 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-trusted-ca\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103134 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16667404-8d60-4bed-bd9e-e8e14dae10a0-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103201 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103227 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5c2833-7cd0-4393-8af3-38aecce665c5-config\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103246 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-config\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103315 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-bound-sa-token\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103353 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd82c\" (UniqueName: \"kubernetes.io/projected/6c5c2833-7cd0-4393-8af3-38aecce665c5-kube-api-access-xd82c\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103378 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/facd7487-680a-45aa-b205-ea98d33f5ac6-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103396 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg727\" (UniqueName: \"kubernetes.io/projected/facd7487-680a-45aa-b205-ea98d33f5ac6-kube-api-access-jg727\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103414 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103437 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hv5v\" (UniqueName: \"kubernetes.io/projected/1201bf36-1a87-438f-9140-a4c310e347ee-kube-api-access-4hv5v\") pod \"downloads-747b44746d-5sjbk\" (UID: \"1201bf36-1a87-438f-9140-a4c310e347ee\") " pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103488 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c5c2833-7cd0-4393-8af3-38aecce665c5-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103503 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16667404-8d60-4bed-bd9e-e8e14dae10a0-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.103581 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2zj\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-kube-api-access-7h2zj\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.107950 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-installation-pull-secrets\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.108201 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:21.608179643 +0000 UTC m=+110.287381461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.116957 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.138879 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.157288 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.179741 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.196814 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.205281 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-krpxs"] Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.210885 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.211366 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:21.711328189 +0000 UTC m=+110.390530007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.211478 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c953c5-c14d-4655-b632-0fbfa822316c-serving-cert\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.211524 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-default-certificate\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.211851 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06c119f7-e161-458b-9962-5b3593bfffda-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.211893 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/819df023-cb2d-4b15-a117-82de9af73281-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-mvlbc\" (UID: \"819df023-cb2d-4b15-a117-82de9af73281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.211928 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-srv-cert\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.211961 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/160d9bf6-cd99-4000-b1b1-ea717f855b81-webhook-certs\") pod \"multus-admission-controller-69db94689b-lmbx9\" (UID: \"160d9bf6-cd99-4000-b1b1-ea717f855b81\") " pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212004 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-plugins-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc 
kubenswrapper[5200]: I1126 13:32:21.212061 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-socket-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212101 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212139 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftqd2\" (UniqueName: \"kubernetes.io/projected/db9aeb0f-11fc-4e41-9ab0-c2a09854fc67-kube-api-access-ftqd2\") pod \"migrator-866fcbc849-q77rr\" (UID: \"db9aeb0f-11fc-4e41-9ab0-c2a09854fc67\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212176 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac4ee1c6-2791-40d8-a167-611e3a8b855f-config\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212207 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-config\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212240 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq6n6\" (UniqueName: \"kubernetes.io/projected/f9c953c5-c14d-4655-b632-0fbfa822316c-kube-api-access-tq6n6\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212287 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gx7\" (UniqueName: \"kubernetes.io/projected/0583aa03-5210-45af-a727-9dd88a965f3b-kube-api-access-j4gx7\") pod \"control-plane-machine-set-operator-75ffdb6fcd-jkbqq\" (UID: \"0583aa03-5210-45af-a727-9dd88a965f3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212317 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-certs\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212355 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c5c2833-7cd0-4393-8af3-38aecce665c5-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212389 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-trusted-ca\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212422 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/626994d9-9c2a-4eb1-8383-090d3e64b27d-tmp-dir\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212452 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee8b88c7-3cb3-495a-a95f-458d98655950-tmp\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212513 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wpt4\" (UniqueName: \"kubernetes.io/projected/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-kube-api-access-5wpt4\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212572 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-service-ca\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212599 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z95mh\" (UniqueName: \"kubernetes.io/projected/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-kube-api-access-z95mh\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212630 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjnq\" (UniqueName: \"kubernetes.io/projected/05227425-db6e-47c1-98a1-a379abd65dbd-kube-api-access-ppjnq\") pod \"ingress-canary-ksh5l\" (UID: \"05227425-db6e-47c1-98a1-a379abd65dbd\") " pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212663 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212717 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ppc\" (UniqueName: \"kubernetes.io/projected/5022e6ca-d788-42fc-80ee-c4b3458b22f2-kube-api-access-h5ppc\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212781 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee8b88c7-3cb3-495a-a95f-458d98655950-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212868 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg727\" (UniqueName: \"kubernetes.io/projected/facd7487-680a-45aa-b205-ea98d33f5ac6-kube-api-access-jg727\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212902 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-node-bootstrap-token\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.212975 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-bound-sa-token\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213007 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-apiservice-cert\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213042 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e175f1f4-4647-4ca1-aecf-baab5eab2da1-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213097 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzqd6\" (UniqueName: \"kubernetes.io/projected/819df023-cb2d-4b15-a117-82de9af73281-kube-api-access-wzqd6\") pod \"package-server-manager-77f986bd66-mvlbc\" (UID: \"819df023-cb2d-4b15-a117-82de9af73281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213161 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hv5v\" (UniqueName: \"kubernetes.io/projected/1201bf36-1a87-438f-9140-a4c310e347ee-kube-api-access-4hv5v\") pod \"downloads-747b44746d-5sjbk\" (UID: \"1201bf36-1a87-438f-9140-a4c310e347ee\") " pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213261 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626994d9-9c2a-4eb1-8383-090d3e64b27d-config\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213349 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac4ee1c6-2791-40d8-a167-611e3a8b855f-serving-cert\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213384 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-config-volume\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213411 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-stats-auth\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213449 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qr67\" (UniqueName: \"kubernetes.io/projected/160d9bf6-cd99-4000-b1b1-ea717f855b81-kube-api-access-9qr67\") pod \"multus-admission-controller-69db94689b-lmbx9\" (UID: \"160d9bf6-cd99-4000-b1b1-ea717f855b81\") " pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213495 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c5c2833-7cd0-4393-8af3-38aecce665c5-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213528 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16667404-8d60-4bed-bd9e-e8e14dae10a0-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213563 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213632 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2zj\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-kube-api-access-7h2zj\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213667 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-client\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213727 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-csi-data-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213785 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-installation-pull-secrets\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213821 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05227425-db6e-47c1-98a1-a379abd65dbd-cert\") pod \"ingress-canary-ksh5l\" (UID: \"05227425-db6e-47c1-98a1-a379abd65dbd\") " pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213860 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm468\" (UniqueName: \"kubernetes.io/projected/e175f1f4-4647-4ca1-aecf-baab5eab2da1-kube-api-access-mm468\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213896 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5022e6ca-d788-42fc-80ee-c4b3458b22f2-service-ca-bundle\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.213997 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16667404-8d60-4bed-bd9e-e8e14dae10a0-config\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214040 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ead8b44-73b3-4f27-8b79-65548b623235-metrics-tls\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214128 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ead8b44-73b3-4f27-8b79-65548b623235-config-volume\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214169 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/facd7487-680a-45aa-b205-ea98d33f5ac6-config\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214263 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-ca-trust-extracted\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214293 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkqc\" (UniqueName: \"kubernetes.io/projected/7ead8b44-73b3-4f27-8b79-65548b623235-kube-api-access-5lkqc\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214370 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdznc\" (UniqueName: \"kubernetes.io/projected/ee8b88c7-3cb3-495a-a95f-458d98655950-kube-api-access-fdznc\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214462 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: 
\"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214504 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-secret-volume\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214539 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-signing-cabundle\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214601 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktwg\" (UniqueName: \"kubernetes.io/projected/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-kube-api-access-mktwg\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214658 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16667404-8d60-4bed-bd9e-e8e14dae10a0-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214767 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhp8\" (UniqueName: \"kubernetes.io/projected/ac4ee1c6-2791-40d8-a167-611e3a8b855f-kube-api-access-9fhp8\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214823 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdx6\" (UniqueName: \"kubernetes.io/projected/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-kube-api-access-hbdx6\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214880 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/626994d9-9c2a-4eb1-8383-090d3e64b27d-kube-api-access\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214917 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-registration-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: 
\"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.214988 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215052 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-mountpoint-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215096 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215133 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjrn\" (UniqueName: \"kubernetes.io/projected/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-kube-api-access-6jjrn\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215198 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16667404-8d60-4bed-bd9e-e8e14dae10a0-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215253 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/06c119f7-e161-458b-9962-5b3593bfffda-images\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215294 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/ee8b88c7-3cb3-495a-a95f-458d98655950-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215358 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-webhook-cert\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: 
\"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215402 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215449 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/626994d9-9c2a-4eb1-8383-090d3e64b27d-serving-cert\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215487 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f3b86d85-2bcb-426a-b891-b3d880ac7991-tmpfs\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215555 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5c2833-7cd0-4393-8af3-38aecce665c5-config\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215592 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-config\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215620 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9c953c5-c14d-4655-b632-0fbfa822316c-tmp-dir\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215713 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c119f7-e161-458b-9962-5b3593bfffda-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215825 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/facd7487-680a-45aa-b205-ea98d33f5ac6-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: 
\"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215926 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-ready\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.215956 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95nzq\" (UniqueName: \"kubernetes.io/projected/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-kube-api-access-95nzq\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216044 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd82c\" (UniqueName: \"kubernetes.io/projected/6c5c2833-7cd0-4393-8af3-38aecce665c5-kube-api-access-xd82c\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216075 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e175f1f4-4647-4ca1-aecf-baab5eab2da1-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216103 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-tmpfs\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216142 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-ca\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216198 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slrfd\" (UniqueName: \"kubernetes.io/projected/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-kube-api-access-slrfd\") pod \"collect-profiles-29402730-z2qtr\" 
(UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216227 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-tmp\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4v28\" (UniqueName: \"kubernetes.io/projected/f3b86d85-2bcb-426a-b891-b3d880ac7991-kube-api-access-n4v28\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216294 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0583aa03-5210-45af-a727-9dd88a965f3b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-jkbqq\" (UID: \"0583aa03-5210-45af-a727-9dd88a965f3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216329 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrxh\" (UniqueName: \"kubernetes.io/projected/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-kube-api-access-7zrxh\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216387 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ead8b44-73b3-4f27-8b79-65548b623235-tmp-dir\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216427 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee8b88c7-3cb3-495a-a95f-458d98655950-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216482 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3b86d85-2bcb-426a-b891-b3d880ac7991-srv-cert\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216518 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: 
\"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216551 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-tmpfs\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.216579 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-metrics-certs\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.217165 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-signing-key\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.217297 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-tls\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.217344 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-certificates\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.217379 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee8b88c7-3cb3-495a-a95f-458d98655950-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.217461 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3b86d85-2bcb-426a-b891-b3d880ac7991-profile-collector-cert\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.217500 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5pk\" (UniqueName: \"kubernetes.io/projected/06c119f7-e161-458b-9962-5b3593bfffda-kube-api-access-fw5pk\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " 
pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.222727 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.226424 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.226806 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-trusted-ca\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.229619 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16667404-8d60-4bed-bd9e-e8e14dae10a0-config\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.231200 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c5c2833-7cd0-4393-8af3-38aecce665c5-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.232962 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16667404-8d60-4bed-bd9e-e8e14dae10a0-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.233508 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-config\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.236076 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-certificates\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.237314 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 
13:32:21.237390 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/facd7487-680a-45aa-b205-ea98d33f5ac6-config\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.237751 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:21.737718477 +0000 UTC m=+110.416920295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.239070 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-ca-trust-extracted\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.240582 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5c2833-7cd0-4393-8af3-38aecce665c5-config\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.247886 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-tls\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.252781 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/facd7487-680a-45aa-b205-ea98d33f5ac6-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.257613 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.258308 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16667404-8d60-4bed-bd9e-e8e14dae10a0-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.258678 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-installation-pull-secrets\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.259899 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.272238 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c5c2833-7cd0-4393-8af3-38aecce665c5-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.277852 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.299737 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.317299 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.318295 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.318638 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:21.818578039 +0000 UTC m=+110.497779857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.318789 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-signing-key\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.318851 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee8b88c7-3cb3-495a-a95f-458d98655950-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.318899 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3b86d85-2bcb-426a-b891-b3d880ac7991-profile-collector-cert\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.318919 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5pk\" (UniqueName: \"kubernetes.io/projected/06c119f7-e161-458b-9962-5b3593bfffda-kube-api-access-fw5pk\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.318944 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c953c5-c14d-4655-b632-0fbfa822316c-serving-cert\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319232 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-default-certificate\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319266 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06c119f7-e161-458b-9962-5b3593bfffda-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319294 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/819df023-cb2d-4b15-a117-82de9af73281-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-mvlbc\" (UID: \"819df023-cb2d-4b15-a117-82de9af73281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319319 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-srv-cert\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319340 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/160d9bf6-cd99-4000-b1b1-ea717f855b81-webhook-certs\") pod \"multus-admission-controller-69db94689b-lmbx9\" (UID: \"160d9bf6-cd99-4000-b1b1-ea717f855b81\") " pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319365 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-plugins-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-socket-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319409 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftqd2\" (UniqueName: \"kubernetes.io/projected/db9aeb0f-11fc-4e41-9ab0-c2a09854fc67-kube-api-access-ftqd2\") pod \"migrator-866fcbc849-q77rr\" (UID: \"db9aeb0f-11fc-4e41-9ab0-c2a09854fc67\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319427 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac4ee1c6-2791-40d8-a167-611e3a8b855f-config\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319446 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-config\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319463 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq6n6\" (UniqueName: \"kubernetes.io/projected/f9c953c5-c14d-4655-b632-0fbfa822316c-kube-api-access-tq6n6\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: 
\"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319489 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gx7\" (UniqueName: \"kubernetes.io/projected/0583aa03-5210-45af-a727-9dd88a965f3b-kube-api-access-j4gx7\") pod \"control-plane-machine-set-operator-75ffdb6fcd-jkbqq\" (UID: \"0583aa03-5210-45af-a727-9dd88a965f3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319517 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-certs\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319543 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/626994d9-9c2a-4eb1-8383-090d3e64b27d-tmp-dir\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319558 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee8b88c7-3cb3-495a-a95f-458d98655950-tmp\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319578 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wpt4\" (UniqueName: \"kubernetes.io/projected/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-kube-api-access-5wpt4\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-service-ca\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319617 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z95mh\" (UniqueName: \"kubernetes.io/projected/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-kube-api-access-z95mh\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319675 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjnq\" (UniqueName: \"kubernetes.io/projected/05227425-db6e-47c1-98a1-a379abd65dbd-kube-api-access-ppjnq\") pod \"ingress-canary-ksh5l\" (UID: \"05227425-db6e-47c1-98a1-a379abd65dbd\") " pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319710 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319729 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ppc\" (UniqueName: \"kubernetes.io/projected/5022e6ca-d788-42fc-80ee-c4b3458b22f2-kube-api-access-h5ppc\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319752 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee8b88c7-3cb3-495a-a95f-458d98655950-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319777 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-node-bootstrap-token\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319798 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-apiservice-cert\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319818 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e175f1f4-4647-4ca1-aecf-baab5eab2da1-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319840 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzqd6\" (UniqueName: \"kubernetes.io/projected/819df023-cb2d-4b15-a117-82de9af73281-kube-api-access-wzqd6\") pod \"package-server-manager-77f986bd66-mvlbc\" (UID: \"819df023-cb2d-4b15-a117-82de9af73281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626994d9-9c2a-4eb1-8383-090d3e64b27d-config\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319886 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ac4ee1c6-2791-40d8-a167-611e3a8b855f-serving-cert\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319904 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-config-volume\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.319922 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-stats-auth\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320051 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qr67\" (UniqueName: \"kubernetes.io/projected/160d9bf6-cd99-4000-b1b1-ea717f855b81-kube-api-access-9qr67\") pod \"multus-admission-controller-69db94689b-lmbx9\" (UID: \"160d9bf6-cd99-4000-b1b1-ea717f855b81\") " pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320080 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320108 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-client\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320126 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-csi-data-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320145 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05227425-db6e-47c1-98a1-a379abd65dbd-cert\") pod \"ingress-canary-ksh5l\" (UID: \"05227425-db6e-47c1-98a1-a379abd65dbd\") " pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320163 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm468\" (UniqueName: \"kubernetes.io/projected/e175f1f4-4647-4ca1-aecf-baab5eab2da1-kube-api-access-mm468\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 
13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320180 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5022e6ca-d788-42fc-80ee-c4b3458b22f2-service-ca-bundle\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320203 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ead8b44-73b3-4f27-8b79-65548b623235-metrics-tls\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320231 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ead8b44-73b3-4f27-8b79-65548b623235-config-volume\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320260 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkqc\" (UniqueName: \"kubernetes.io/projected/7ead8b44-73b3-4f27-8b79-65548b623235-kube-api-access-5lkqc\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320295 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdznc\" (UniqueName: \"kubernetes.io/projected/ee8b88c7-3cb3-495a-a95f-458d98655950-kube-api-access-fdznc\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320319 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-secret-volume\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320337 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-signing-cabundle\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320357 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mktwg\" (UniqueName: \"kubernetes.io/projected/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-kube-api-access-mktwg\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320378 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhp8\" (UniqueName: \"kubernetes.io/projected/ac4ee1c6-2791-40d8-a167-611e3a8b855f-kube-api-access-9fhp8\") pod \"service-ca-operator-5b9c976747-rnvc6\" 
(UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320395 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdx6\" (UniqueName: \"kubernetes.io/projected/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-kube-api-access-hbdx6\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320415 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/626994d9-9c2a-4eb1-8383-090d3e64b27d-kube-api-access\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320432 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-registration-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320455 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320472 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-mountpoint-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320489 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320512 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjrn\" (UniqueName: \"kubernetes.io/projected/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-kube-api-access-6jjrn\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320532 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/06c119f7-e161-458b-9962-5b3593bfffda-images\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320550 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/ee8b88c7-3cb3-495a-a95f-458d98655950-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320572 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-webhook-cert\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320596 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320633 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/626994d9-9c2a-4eb1-8383-090d3e64b27d-serving-cert\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320649 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f3b86d85-2bcb-426a-b891-b3d880ac7991-tmpfs\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320670 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9c953c5-c14d-4655-b632-0fbfa822316c-tmp-dir\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320708 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c119f7-e161-458b-9962-5b3593bfffda-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320736 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-ready\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320755 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95nzq\" (UniqueName: \"kubernetes.io/projected/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-kube-api-access-95nzq\") pod 
\"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320780 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e175f1f4-4647-4ca1-aecf-baab5eab2da1-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320795 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-tmpfs\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320816 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-ca\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320835 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slrfd\" (UniqueName: \"kubernetes.io/projected/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-kube-api-access-slrfd\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320854 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-tmp\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320875 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4v28\" (UniqueName: \"kubernetes.io/projected/f3b86d85-2bcb-426a-b891-b3d880ac7991-kube-api-access-n4v28\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320894 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0583aa03-5210-45af-a727-9dd88a965f3b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-jkbqq\" (UID: \"0583aa03-5210-45af-a727-9dd88a965f3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320912 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrxh\" (UniqueName: \"kubernetes.io/projected/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-kube-api-access-7zrxh\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 
13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320932 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ead8b44-73b3-4f27-8b79-65548b623235-tmp-dir\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320951 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee8b88c7-3cb3-495a-a95f-458d98655950-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320979 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3b86d85-2bcb-426a-b891-b3d880ac7991-srv-cert\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.320997 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.321015 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-tmpfs\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.321032 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-metrics-certs\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.325761 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5022e6ca-d788-42fc-80ee-c4b3458b22f2-service-ca-bundle\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.326481 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-service-ca\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.327757 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee8b88c7-3cb3-495a-a95f-458d98655950-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: 
\"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.330429 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/626994d9-9c2a-4eb1-8383-090d3e64b27d-tmp-dir\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.330579 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06c119f7-e161-458b-9962-5b3593bfffda-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.330838 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.331191 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-plugins-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.331189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee8b88c7-3cb3-495a-a95f-458d98655950-tmp\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.331231 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-signing-key\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.331273 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-socket-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.331811 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ead8b44-73b3-4f27-8b79-65548b623235-config-volume\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.332799 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c953c5-c14d-4655-b632-0fbfa822316c-serving-cert\") 
pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.335080 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-tmpfs\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.335632 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-ca\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.335807 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c953c5-c14d-4655-b632-0fbfa822316c-config\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.336176 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/626994d9-9c2a-4eb1-8383-090d3e64b27d-serving-cert\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.336339 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-default-certificate\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.337648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/06c119f7-e161-458b-9962-5b3593bfffda-images\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.337782 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee8b88c7-3cb3-495a-a95f-458d98655950-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.337915 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-srv-cert\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.338205 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-registration-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.338263 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-mountpoint-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.338280 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.338549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-tmp\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.339042 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f9c953c5-c14d-4655-b632-0fbfa822316c-tmp-dir\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.339244 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.339852 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-tmpfs\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.339902 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-metrics-certs\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.340747 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-csi-data-dir\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.340937 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-ready\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.341921 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e175f1f4-4647-4ca1-aecf-baab5eab2da1-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.342914 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06c119f7-e161-458b-9962-5b3593bfffda-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.343025 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:21.843006063 +0000 UTC m=+110.522207881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.343453 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e175f1f4-4647-4ca1-aecf-baab5eab2da1-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.343795 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/ee8b88c7-3cb3-495a-a95f-458d98655950-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.343857 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626994d9-9c2a-4eb1-8383-090d3e64b27d-config\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.344136 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ead8b44-73b3-4f27-8b79-65548b623235-tmp-dir\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc 
kubenswrapper[5200]: I1126 13:32:21.345095 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f3b86d85-2bcb-426a-b891-b3d880ac7991-tmpfs\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.345110 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f3b86d85-2bcb-426a-b891-b3d880ac7991-profile-collector-cert\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.347844 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac4ee1c6-2791-40d8-a167-611e3a8b855f-config\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.350141 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac4ee1c6-2791-40d8-a167-611e3a8b855f-serving-cert\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.350192 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-webhook-cert\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.351603 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-signing-cabundle\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.352481 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.353191 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f3b86d85-2bcb-426a-b891-b3d880ac7991-srv-cert\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.353362 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5022e6ca-d788-42fc-80ee-c4b3458b22f2-stats-auth\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.353581 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-config-volume\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.353956 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0583aa03-5210-45af-a727-9dd88a965f3b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-jkbqq\" (UID: \"0583aa03-5210-45af-a727-9dd88a965f3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.356227 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.357820 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/819df023-cb2d-4b15-a117-82de9af73281-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-mvlbc\" (UID: \"819df023-cb2d-4b15-a117-82de9af73281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.361145 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/160d9bf6-cd99-4000-b1b1-ea717f855b81-webhook-certs\") pod \"multus-admission-controller-69db94689b-lmbx9\" (UID: \"160d9bf6-cd99-4000-b1b1-ea717f855b81\") " pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.361250 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-secret-volume\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.362943 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-apiservice-cert\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.363182 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f9c953c5-c14d-4655-b632-0fbfa822316c-etcd-client\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.365235 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ead8b44-73b3-4f27-8b79-65548b623235-metrics-tls\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.366009 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.367754 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.378398 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.397471 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.417494 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.423397 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.424087 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:21.924060781 +0000 UTC m=+110.603262589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.427255 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" event={"ID":"4dc74e00-faf0-4e1f-9b82-4a03c34a8928","Type":"ContainerStarted","Data":"841b6aab14ebe478453bd9f0aa995635d554326bc03a9ad7f3355a0ceb7c1331"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.427308 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" event={"ID":"4dc74e00-faf0-4e1f-9b82-4a03c34a8928","Type":"ContainerStarted","Data":"eff82b1e73ebbed0f103d0f72ea108e97de7c9f78e31fa19dde08cd05b2bf3f1"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.427319 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" event={"ID":"4dc74e00-faf0-4e1f-9b82-4a03c34a8928","Type":"ContainerStarted","Data":"37df473a379745f92393ab4394a21de341412db161623e01b4a1208f88c8066c"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.437822 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.440071 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" event={"ID":"6a3a94bc-1154-4d4f-b1f2-061c4b252e8f","Type":"ContainerStarted","Data":"a0f048d212b2852c47096fcc3b19b4c7a5d3121fb6ff2786bba358552c1aef0c"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.444825 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" event={"ID":"a6346413-47ab-40fc-bf6b-9315694ebaaa","Type":"ContainerStarted","Data":"c173e472e5649414e692ba97cb4b725f7ddc60b87c14465904ee236129367fe2"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.451678 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" event={"ID":"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96","Type":"ContainerStarted","Data":"1998cdb658d40e9764d8db04c9cfecb3ebe7938f143bd58df730d4a8a0f2e07e"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.457941 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-node-bootstrap-token\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.459724 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.464569 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" 
event={"ID":"0ff30ccc-02d7-4687-921d-9f43931df261","Type":"ContainerStarted","Data":"d0c56b8c12f4f16a2dba96c3cef90416530203afc35aca20a28d4a06001976d5"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.466892 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-certs\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.473190 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-scmch" event={"ID":"aff96734-e4bb-4642-9207-9293cce04a16","Type":"ContainerStarted","Data":"60312f2cd42b5a0a32cf4a5cd2fee5a6f5cba58eaaea5d2be1bda7ab96db47d5"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.478126 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.479926 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" event={"ID":"a5a4136e-159f-4279-9632-b62ccf9824e1","Type":"ContainerStarted","Data":"6dde2e774ece99a97a0eb1c8879dbc4d66ae948f16e8ef057310d3daeff076d2"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.479991 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" event={"ID":"a5a4136e-159f-4279-9632-b62ccf9824e1","Type":"ContainerStarted","Data":"1807da44f42f7d44b040ead8baad000061cb0ba5caea995dc2f26023d694bdc5"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.490468 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05227425-db6e-47c1-98a1-a379abd65dbd-cert\") pod \"ingress-canary-ksh5l\" (UID: \"05227425-db6e-47c1-98a1-a379abd65dbd\") " pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.497241 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" event={"ID":"82a64db7-bf3c-410b-9772-e830bfa83118","Type":"ContainerStarted","Data":"fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.497293 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" event={"ID":"82a64db7-bf3c-410b-9772-e830bfa83118","Type":"ContainerStarted","Data":"fc084a296fdbbc4e6740bd3d1b4ec6d4b70cb6d3abd032769acdffe79e49105b"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.498272 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.501919 5200 patch_prober.go:28] interesting pod/route-controller-manager-776cdc94d6-685h9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.502000 5200 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" podUID="82a64db7-bf3c-410b-9772-e830bfa83118" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.505611 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.520477 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" event={"ID":"6d2ce281-016f-4cb6-b51f-5fbdcd007f99","Type":"ContainerStarted","Data":"f4f61c64e793970777e86089c50b1dbccf07d75e5a1fbdca328043807bbe5cab"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.520948 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.529308 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" event={"ID":"b60d512c-29b7-4ac4-b63d-59a43f736b05","Type":"ContainerStarted","Data":"b18457794262cc4db680cd7af9fa2acd3bd008c52d9103bd302965df0f422c92"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.532727 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.534712 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" event={"ID":"7363e7ee-68c6-4682-acf4-95c6ab68965a","Type":"ContainerStarted","Data":"389ae3a4c3238d537eaecbca10034f7f7d6165fa1b1e74078afd8385241d5cc0"} Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.534773 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" event={"ID":"7363e7ee-68c6-4682-acf4-95c6ab68965a","Type":"ContainerStarted","Data":"2df86e0fd0198fb51c8ddf7fc15b289629fec93e1c1cb119e575185e61b081fb"} Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.535864 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.035846118 +0000 UTC m=+110.715047936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.536955 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.603955 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd4ed867-e8fc-46a0-8a90-23b95b55d9da-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-hs8rn\" (UID: \"cd4ed867-e8fc-46a0-8a90-23b95b55d9da\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.630656 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg727\" (UniqueName: \"kubernetes.io/projected/facd7487-680a-45aa-b205-ea98d33f5ac6-kube-api-access-jg727\") pod \"kube-storage-version-migrator-operator-565b79b866-snhf2\" (UID: \"facd7487-680a-45aa-b205-ea98d33f5ac6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.635128 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.645207 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.145161845 +0000 UTC m=+110.824363663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.669883 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-bound-sa-token\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.687044 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2zj\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-kube-api-access-7h2zj\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.690569 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hv5v\" (UniqueName: \"kubernetes.io/projected/1201bf36-1a87-438f-9140-a4c310e347ee-kube-api-access-4hv5v\") pod \"downloads-747b44746d-5sjbk\" (UID: \"1201bf36-1a87-438f-9140-a4c310e347ee\") " pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.707200 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd82c\" (UniqueName: \"kubernetes.io/projected/6c5c2833-7cd0-4393-8af3-38aecce665c5-kube-api-access-xd82c\") pod \"openshift-controller-manager-operator-686468bdd5-mcj4w\" (UID: \"6c5c2833-7cd0-4393-8af3-38aecce665c5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.717931 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16667404-8d60-4bed-bd9e-e8e14dae10a0-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-bnssc\" (UID: \"16667404-8d60-4bed-bd9e-e8e14dae10a0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.737356 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.739148 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.739745 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 13:32:22.239728641 +0000 UTC m=+110.918930459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.751445 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm468\" (UniqueName: \"kubernetes.io/projected/e175f1f4-4647-4ca1-aecf-baab5eab2da1-kube-api-access-mm468\") pod \"machine-config-controller-f9cdd68f7-jtgks\" (UID: \"e175f1f4-4647-4ca1-aecf-baab5eab2da1\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.759295 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5pk\" (UniqueName: \"kubernetes.io/projected/06c119f7-e161-458b-9962-5b3593bfffda-kube-api-access-fw5pk\") pod \"machine-config-operator-67c9d58cbb-qrgss\" (UID: \"06c119f7-e161-458b-9962-5b3593bfffda\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.792753 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftqd2\" (UniqueName: \"kubernetes.io/projected/db9aeb0f-11fc-4e41-9ab0-c2a09854fc67-kube-api-access-ftqd2\") pod \"migrator-866fcbc849-q77rr\" (UID: \"db9aeb0f-11fc-4e41-9ab0-c2a09854fc67\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.795655 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z95mh\" (UniqueName: \"kubernetes.io/projected/5f70d661-ea0a-4bf6-84bd-35f35ffb8c43-kube-api-access-z95mh\") pod \"packageserver-7d4fc7d867-7s4qq\" (UID: \"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.795800 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.819250 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjnq\" (UniqueName: \"kubernetes.io/projected/05227425-db6e-47c1-98a1-a379abd65dbd-kube-api-access-ppjnq\") pod \"ingress-canary-ksh5l\" (UID: \"05227425-db6e-47c1-98a1-a379abd65dbd\") " pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.841256 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.841391 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 
podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.341359235 +0000 UTC m=+111.020561063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.842736 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.844139 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq6n6\" (UniqueName: \"kubernetes.io/projected/f9c953c5-c14d-4655-b632-0fbfa822316c-kube-api-access-tq6n6\") pod \"etcd-operator-69b85846b6-wn4bx\" (UID: \"f9c953c5-c14d-4655-b632-0fbfa822316c\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.844167 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.344152703 +0000 UTC m=+111.023354511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.853989 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.859522 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.865969 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee8b88c7-3cb3-495a-a95f-458d98655950-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.867275 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.875789 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzqd6\" (UniqueName: \"kubernetes.io/projected/819df023-cb2d-4b15-a117-82de9af73281-kube-api-access-wzqd6\") pod \"package-server-manager-77f986bd66-mvlbc\" (UID: \"819df023-cb2d-4b15-a117-82de9af73281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.891883 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.897090 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktwg\" (UniqueName: \"kubernetes.io/projected/fad4c3f5-8f22-4fed-815e-6e9bb8db7a40-kube-api-access-mktwg\") pod \"catalog-operator-75ff9f647d-7dcwj\" (UID: \"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.915957 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.919068 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95nzq\" (UniqueName: \"kubernetes.io/projected/5056ea4a-2f50-4c28-95bf-bd0c21db14ea-kube-api-access-95nzq\") pod \"machine-config-server-twnv2\" (UID: \"5056ea4a-2f50-4c28-95bf-bd0c21db14ea\") " pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.924337 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.934138 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.944276 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.945904 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ppc\" (UniqueName: \"kubernetes.io/projected/5022e6ca-d788-42fc-80ee-c4b3458b22f2-kube-api-access-h5ppc\") pod \"router-default-68cf44c8b8-cpmj7\" (UID: \"5022e6ca-d788-42fc-80ee-c4b3458b22f2\") " pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: E1126 13:32:21.947644 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.447612587 +0000 UTC m=+111.126814405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.948845 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.971541 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdx6\" (UniqueName: \"kubernetes.io/projected/1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba-kube-api-access-hbdx6\") pod \"csi-hostpathplugin-ctstk\" (UID: \"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba\") " pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.972909 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:21 crc kubenswrapper[5200]: I1126 13:32:21.997020 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.018094 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wpt4\" (UniqueName: \"kubernetes.io/projected/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-kube-api-access-5wpt4\") pod \"cni-sysctl-allowlist-ds-rnv6n\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.022782 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjrn\" (UniqueName: \"kubernetes.io/projected/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-kube-api-access-6jjrn\") pod \"marketplace-operator-547dbd544d-gj489\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.034273 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gx7\" (UniqueName: \"kubernetes.io/projected/0583aa03-5210-45af-a727-9dd88a965f3b-kube-api-access-j4gx7\") pod \"control-plane-machine-set-operator-75ffdb6fcd-jkbqq\" (UID: \"0583aa03-5210-45af-a727-9dd88a965f3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.034707 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.035399 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.041709 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slrfd\" (UniqueName: \"kubernetes.io/projected/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-kube-api-access-slrfd\") pod \"collect-profiles-29402730-z2qtr\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.041963 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ksh5l" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.052838 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.053272 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.553253503 +0000 UTC m=+111.232455321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.056292 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.079186 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkqc\" (UniqueName: \"kubernetes.io/projected/7ead8b44-73b3-4f27-8b79-65548b623235-kube-api-access-5lkqc\") pod \"dns-default-lpc6x\" (UID: \"7ead8b44-73b3-4f27-8b79-65548b623235\") " pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.079849 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdznc\" (UniqueName: \"kubernetes.io/projected/ee8b88c7-3cb3-495a-a95f-458d98655950-kube-api-access-fdznc\") pod \"cluster-image-registry-operator-86c45576b9-lfbjb\" (UID: \"ee8b88c7-3cb3-495a-a95f-458d98655950\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.097820 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.098896 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhp8\" (UniqueName: \"kubernetes.io/projected/ac4ee1c6-2791-40d8-a167-611e3a8b855f-kube-api-access-9fhp8\") pod \"service-ca-operator-5b9c976747-rnvc6\" (UID: \"ac4ee1c6-2791-40d8-a167-611e3a8b855f\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.099519 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.115082 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/626994d9-9c2a-4eb1-8383-090d3e64b27d-kube-api-access\") pod \"kube-apiserver-operator-575994946d-sz67t\" (UID: \"626994d9-9c2a-4eb1-8383-090d3e64b27d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.120382 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.133335 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-twnv2" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.136661 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrxh\" (UniqueName: \"kubernetes.io/projected/1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e-kube-api-access-7zrxh\") pod \"service-ca-74545575db-76hbn\" (UID: \"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e\") " pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.153985 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.154535 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.654505726 +0000 UTC m=+111.333707724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.154630 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.155967 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.655954126 +0000 UTC m=+111.335155944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.179755 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qr67\" (UniqueName: \"kubernetes.io/projected/160d9bf6-cd99-4000-b1b1-ea717f855b81-kube-api-access-9qr67\") pod \"multus-admission-controller-69db94689b-lmbx9\" (UID: \"160d9bf6-cd99-4000-b1b1-ea717f855b81\") " pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.196966 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4v28\" (UniqueName: \"kubernetes.io/projected/f3b86d85-2bcb-426a-b891-b3d880ac7991-kube-api-access-n4v28\") pod \"olm-operator-5cdf44d969-s9fc2\" (UID: \"f3b86d85-2bcb-426a-b891-b3d880ac7991\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.197939 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn"] Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.223307 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.244978 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.245820 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.246551 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-67c89758df-ksjj6" podStartSLOduration=20.24652805 podStartE2EDuration="20.24652805s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:22.205804341 +0000 UTC m=+110.885006169" watchObservedRunningTime="2025-11-26 13:32:22.24652805 +0000 UTC m=+110.925729868" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.256802 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.264029 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.264603 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.265236 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.765205043 +0000 UTC m=+111.444406861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.269316 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.277354 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-76hbn" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.278849 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.292081 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.316378 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.324957 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" podStartSLOduration=21.324929534 podStartE2EDuration="21.324929534s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:22.321191769 +0000 UTC m=+111.000393587" watchObservedRunningTime="2025-11-26 13:32:22.324929534 +0000 UTC m=+111.004131352" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.366416 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.366851 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.866835316 +0000 UTC m=+111.546037134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.378424 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc"] Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.432445 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" podStartSLOduration=20.432411221 podStartE2EDuration="20.432411221s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:22.43129846 +0000 UTC m=+111.110500288" watchObservedRunningTime="2025-11-26 13:32:22.432411221 +0000 UTC m=+111.111613039" Nov 26 13:32:22 crc kubenswrapper[5200]: W1126 13:32:22.466649 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5022e6ca_d788_42fc_80ee_c4b3458b22f2.slice/crio-ed055592e32302ad09633291c45797ef93a3bbfda2334dbada2f88ad603d4814 WatchSource:0}: Error finding container ed055592e32302ad09633291c45797ef93a3bbfda2334dbada2f88ad603d4814: Status 404 returned error can't find the container with id ed055592e32302ad09633291c45797ef93a3bbfda2334dbada2f88ad603d4814 Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.467750 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.468200 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:22.968180301 +0000 UTC m=+111.647382119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.517948 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w"] Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.571401 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.572055 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.072030787 +0000 UTC m=+111.751232605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.589198 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" event={"ID":"b60d512c-29b7-4ac4-b63d-59a43f736b05","Type":"ContainerStarted","Data":"7641aa50e1fcbd696292a079420b73bda434ce04c1a56a693c6aad5c6a535ca3"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.589719 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" event={"ID":"b60d512c-29b7-4ac4-b63d-59a43f736b05","Type":"ContainerStarted","Data":"48604ebd5df28b4b610be096195d4e13c43fb3f6bc4cb40e8d72446c59ee9252"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.599020 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" event={"ID":"5022e6ca-d788-42fc-80ee-c4b3458b22f2","Type":"ContainerStarted","Data":"ed055592e32302ad09633291c45797ef93a3bbfda2334dbada2f88ad603d4814"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.600327 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" event={"ID":"cd4ed867-e8fc-46a0-8a90-23b95b55d9da","Type":"ContainerStarted","Data":"82efb1496dc459b0c0d91fa19e2fde36e0a77291b2e6635a3d341ac0bb7f447b"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.655789 5200 generic.go:358] "Generic (PLEG): container finished" podID="260a7f9c-0a16-4a9c-95f9-84ddef0bbf96" containerID="e923d6152cad46be5ec4175f681834040cd45a7eeb215c00f53536f21d3418ec" exitCode=0 Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.655993 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" event={"ID":"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96","Type":"ContainerDied","Data":"e923d6152cad46be5ec4175f681834040cd45a7eeb215c00f53536f21d3418ec"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.672862 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.690335 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.190301566 +0000 UTC m=+111.869503384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.723742 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx"] Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.727950 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-scmch" event={"ID":"aff96734-e4bb-4642-9207-9293cce04a16","Type":"ContainerStarted","Data":"084a15ae83b1d6dacb087fc12cec1492642e3e8228afd0f0aa664d6fbc4a86ee"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.728011 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-scmch" event={"ID":"aff96734-e4bb-4642-9207-9293cce04a16","Type":"ContainerStarted","Data":"9bbc3f29f950eaa4a69f4941655662ea86e47967660788bcb5994cc1325d0a5e"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.739455 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" event={"ID":"a5a4136e-159f-4279-9632-b62ccf9824e1","Type":"ContainerStarted","Data":"a831ee96fddb1eac5752770607693bd6fd38745e1f087194f0b50312d6a9c3fa"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.745709 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" event={"ID":"6d2ce281-016f-4cb6-b51f-5fbdcd007f99","Type":"ContainerStarted","Data":"2d47f7230cf8cfb9fd41e2d8110ea7ae040daf44e05feaca1315dd501a5ec3c0"} Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.777337 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.778630 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.278612197 +0000 UTC m=+111.957814015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.875229 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d44f6ddf-vvkzj" podStartSLOduration=20.875177198 podStartE2EDuration="20.875177198s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:22.874328955 +0000 UTC m=+111.553530773" watchObservedRunningTime="2025-11-26 13:32:22.875177198 +0000 UTC m=+111.554379016" Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.880386 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.881069 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.381039312 +0000 UTC m=+112.060241130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:22 crc kubenswrapper[5200]: I1126 13:32:22.982772 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:22 crc kubenswrapper[5200]: E1126 13:32:22.983213 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.48319634 +0000 UTC m=+112.162398158 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.001304 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-5cj6f" podStartSLOduration=22.001280496 podStartE2EDuration="22.001280496s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:22.997181792 +0000 UTC m=+111.676383620" watchObservedRunningTime="2025-11-26 13:32:23.001280496 +0000 UTC m=+111.680482314" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.053236 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.084450 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.085286 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.585243805 +0000 UTC m=+112.264445624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.109757 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-5sjbk"] Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.188966 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.192924 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.692903358 +0000 UTC m=+112.372105176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.290797 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.291491 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.791467265 +0000 UTC m=+112.470669083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.397928 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.398341 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.898324675 +0000 UTC m=+112.577526493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.498795 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.499318 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:23.99929489 +0000 UTC m=+112.678496708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.514005 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jr48p" podStartSLOduration=22.513982821 podStartE2EDuration="22.513982821s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.512481189 +0000 UTC m=+112.191683007" watchObservedRunningTime="2025-11-26 13:32:23.513982821 +0000 UTC m=+112.193184639" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.551038 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-kxjqs" podStartSLOduration=22.551012617 podStartE2EDuration="22.551012617s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.550826701 +0000 UTC m=+112.230028529" watchObservedRunningTime="2025-11-26 13:32:23.551012617 +0000 UTC m=+112.230214435" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.603483 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.603970 5200 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.103954438 +0000 UTC m=+112.783156256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.604798 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-755bb95488-g99bm" podStartSLOduration=21.604767101 podStartE2EDuration="21.604767101s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.602877188 +0000 UTC m=+112.282079006" watchObservedRunningTime="2025-11-26 13:32:23.604767101 +0000 UTC m=+112.283968919" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.634195 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-q2qxn" podStartSLOduration=21.634170413 podStartE2EDuration="21.634170413s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.62975731 +0000 UTC m=+112.308959128" watchObservedRunningTime="2025-11-26 13:32:23.634170413 +0000 UTC m=+112.313372251" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.677474 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" podStartSLOduration=21.677447844 podStartE2EDuration="21.677447844s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.676907329 +0000 UTC m=+112.356109157" watchObservedRunningTime="2025-11-26 13:32:23.677447844 +0000 UTC m=+112.356649662" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.705277 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.705936 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.205917921 +0000 UTC m=+112.885119739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.739417 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qmpmk" podStartSLOduration=21.739395377 podStartE2EDuration="21.739395377s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.738112061 +0000 UTC m=+112.417313899" watchObservedRunningTime="2025-11-26 13:32:23.739395377 +0000 UTC m=+112.418597195" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.798824 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-twnv2" event={"ID":"5056ea4a-2f50-4c28-95bf-bd0c21db14ea","Type":"ContainerStarted","Data":"5af5c340d2ca26fdb8ab716259c5edf18542f582251c26acc51d9bd43f0f5e4e"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.801085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" event={"ID":"16667404-8d60-4bed-bd9e-e8e14dae10a0","Type":"ContainerStarted","Data":"ee6f9e1f9f4ddd4818246a344e186366c2c1283863179e7ee0f794af074aebfe"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.809306 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.809653 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.309640013 +0000 UTC m=+112.988841831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.810601 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" event={"ID":"6d2ce281-016f-4cb6-b51f-5fbdcd007f99","Type":"ContainerStarted","Data":"2f593873d891d246306d1563f7279a8c2ca9c7443667a632d3ae0a5c91346a11"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.821092 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" event={"ID":"6c5c2833-7cd0-4393-8af3-38aecce665c5","Type":"ContainerStarted","Data":"c8bb0c5b33766f0adb53442786b0860514145e987583fbaf4b6e0efc1279dd04"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.855059 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" event={"ID":"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686","Type":"ContainerStarted","Data":"37c444c31037429a6d335cc2ad1e4fcc2499a842a17237671cc3b4f40bdc444c"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.864439 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.865081 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.883786 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" event={"ID":"a6346413-47ab-40fc-bf6b-9315694ebaaa","Type":"ContainerStarted","Data":"9f54c48d47ed708270bd3759d86e64f67b3ad736116e8208a6b675aef56c725d"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.907851 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.908783 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" podStartSLOduration=21.908752745 podStartE2EDuration="21.908752745s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.873554581 +0000 UTC m=+112.552756409" watchObservedRunningTime="2025-11-26 13:32:23.908752745 +0000 UTC m=+112.587954563" Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.912135 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:23 crc kubenswrapper[5200]: E1126 13:32:23.913929 5200 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.413886819 +0000 UTC m=+113.093088637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.935130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" event={"ID":"f9c953c5-c14d-4655-b632-0fbfa822316c","Type":"ContainerStarted","Data":"901bfc138c7c4412631d71dbc8fd4857ba754d9b32ca1fb891576357e709ca54"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.944504 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-5sjbk" event={"ID":"1201bf36-1a87-438f-9140-a4c310e347ee","Type":"ContainerStarted","Data":"3219cbe48efc185748cf4a0309fca5f2c5ad021363cdc14c66e41571a87bdd0f"} Nov 26 13:32:23 crc kubenswrapper[5200]: I1126 13:32:23.975955 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-bjpbs" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.003187 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" podStartSLOduration=23.003162997 podStartE2EDuration="23.003162997s" podCreationTimestamp="2025-11-26 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:23.995503433 +0000 UTC m=+112.674705251" watchObservedRunningTime="2025-11-26 13:32:24.003162997 +0000 UTC m=+112.682364815" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.014536 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.017379 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.517358544 +0000 UTC m=+113.196560362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.092907 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.119336 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.120937 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.620913741 +0000 UTC m=+113.300115559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.154663 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-799b87ffcd-g7wb6" podStartSLOduration=22.154643075 podStartE2EDuration="22.154643075s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:24.153211275 +0000 UTC m=+112.832413093" watchObservedRunningTime="2025-11-26 13:32:24.154643075 +0000 UTC m=+112.833844883" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.167431 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.221742 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.222127 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.722111893 +0000 UTC m=+113.401313711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.227854 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.229604 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.278729 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-scmch" podStartSLOduration=22.277675487 podStartE2EDuration="22.277675487s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:24.27169878 +0000 UTC m=+112.950900608" watchObservedRunningTime="2025-11-26 13:32:24.277675487 +0000 UTC m=+112.956877305" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.323439 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.323761 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.823736526 +0000 UTC m=+113.502938334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.376625 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ksh5l"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.381768 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.410038 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.425812 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.426032 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr"] Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.426297 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:24.926279965 +0000 UTC m=+113.605481783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.433032 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-76hbn"] Nov 26 13:32:24 crc kubenswrapper[5200]: W1126 13:32:24.505157 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb9aeb0f_11fc_4e41_9ab0_c2a09854fc67.slice/crio-da4c2ff976e422b43b8f7508b7f43030c9dc9b59d94d68f9b8ade93d4c151f14 WatchSource:0}: Error finding container da4c2ff976e422b43b8f7508b7f43030c9dc9b59d94d68f9b8ade93d4c151f14: Status 404 returned error can't find the container with id da4c2ff976e422b43b8f7508b7f43030c9dc9b59d94d68f9b8ade93d4c151f14 Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.507780 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.519145 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.530512 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.531188 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.031163229 +0000 UTC m=+113.710365037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.618822 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.619412 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.633227 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.646525 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.133670907 +0000 UTC m=+113.812872725 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.714260 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.728155 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.728230 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.734445 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.748283 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.748552 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 
podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.24850646 +0000 UTC m=+113.927708278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.749810 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.750352 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.250332251 +0000 UTC m=+113.929534069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.796630 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lpc6x"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.797098 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-lmbx9"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.797119 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ctstk"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.803459 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-gj489"] Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.851369 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.851803 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.35178583 +0000 UTC m=+114.030987638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: W1126 13:32:24.891543 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626994d9_9c2a_4eb1_8383_090d3e64b27d.slice/crio-d1115e66bad3fd8302b8b2a9dde67a52b3897c5295d9a81a1c583c92a4ad1440 WatchSource:0}: Error finding container d1115e66bad3fd8302b8b2a9dde67a52b3897c5295d9a81a1c583c92a4ad1440: Status 404 returned error can't find the container with id d1115e66bad3fd8302b8b2a9dde67a52b3897c5295d9a81a1c583c92a4ad1440 Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.953947 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:24 crc kubenswrapper[5200]: E1126 13:32:24.954627 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.454602716 +0000 UTC m=+114.133804534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.994253 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" event={"ID":"16667404-8d60-4bed-bd9e-e8e14dae10a0","Type":"ContainerStarted","Data":"25790b099256cbceb7b3ae3f6ab318112956c326ff205231382009ca0c9160d0"} Nov 26 13:32:24 crc kubenswrapper[5200]: I1126 13:32:24.994302 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" event={"ID":"db9aeb0f-11fc-4e41-9ab0-c2a09854fc67","Type":"ContainerStarted","Data":"da4c2ff976e422b43b8f7508b7f43030c9dc9b59d94d68f9b8ade93d4c151f14"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.003943 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" event={"ID":"06c119f7-e161-458b-9962-5b3593bfffda","Type":"ContainerStarted","Data":"ec949c9875db2ae6efc738c42043aaa8c396c7d4e446ae12d45e0f240c70da88"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.016516 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-bnssc" podStartSLOduration=23.016492958 podStartE2EDuration="23.016492958s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.014487382 +0000 UTC m=+113.693689200" watchObservedRunningTime="2025-11-26 13:32:25.016492958 +0000 UTC m=+113.695694776" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.040570 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" event={"ID":"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43","Type":"ContainerStarted","Data":"4bdf6747de10e5ff37e36947ce463e9f83c0afed77509559aa16d7c95f3139b8"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.040653 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" event={"ID":"5f70d661-ea0a-4bf6-84bd-35f35ffb8c43","Type":"ContainerStarted","Data":"ad92b395809227297c403dc094e4a537712083bbeb8583c49274db25d4a02634"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.051228 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.053647 5200 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-7s4qq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.053767 5200 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" podUID="5f70d661-ea0a-4bf6-84bd-35f35ffb8c43" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.054853 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.055121 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.555096118 +0000 UTC m=+114.234297936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.055356 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.055743 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.555735436 +0000 UTC m=+114.234937254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.077484 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" event={"ID":"5022e6ca-d788-42fc-80ee-c4b3458b22f2","Type":"ContainerStarted","Data":"40bfb45e3ef429e59bf1b4a0d222702e287a0926d3964d1cacc07be1cb618316"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.079597 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" podStartSLOduration=23.079588443 podStartE2EDuration="23.079588443s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.079093229 +0000 UTC m=+113.758295047" watchObservedRunningTime="2025-11-26 13:32:25.079588443 +0000 UTC m=+113.758790261" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.101509 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" event={"ID":"6c5c2833-7cd0-4393-8af3-38aecce665c5","Type":"ContainerStarted","Data":"ff38028e4dbb5b892b5439fc6656674921129317cca4992c22faf051fafaaba2"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.112777 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podStartSLOduration=23.112756711 podStartE2EDuration="23.112756711s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.112198175 +0000 UTC m=+113.791400003" watchObservedRunningTime="2025-11-26 13:32:25.112756711 +0000 UTC m=+113.791958529" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.116572 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-76hbn" event={"ID":"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e","Type":"ContainerStarted","Data":"d7b6f0476aaeb909028f2980b060a5f97e7af4eccc332193a6847e4f457dff67"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.129445 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ksh5l" event={"ID":"05227425-db6e-47c1-98a1-a379abd65dbd","Type":"ContainerStarted","Data":"6b743081dc7e1cb561ffc29428f0e4bd9c51de1eb1d9d7c386c32a2d93703622"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.129518 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ksh5l" event={"ID":"05227425-db6e-47c1-98a1-a379abd65dbd","Type":"ContainerStarted","Data":"3baf5af421219315c13c7be500db47350214908bff45d6fd23da0c4060dbe1b0"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.161582 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.161764 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.661735721 +0000 UTC m=+114.340937539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.161953 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.162091 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" event={"ID":"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686","Type":"ContainerStarted","Data":"281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43"} Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.162360 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.662352519 +0000 UTC m=+114.341554327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.165346 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" event={"ID":"e175f1f4-4647-4ca1-aecf-baab5eab2da1","Type":"ContainerStarted","Data":"588ffbd17559301947edbab7fa26d71c43d53ae94e50d09713bce2614a03528e"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.166572 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" event={"ID":"160d9bf6-cd99-4000-b1b1-ea717f855b81","Type":"ContainerStarted","Data":"1eb5f86d6689c28fa999a5625bb984e186b587b1fd1024c726dc0318b098cb12"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.177547 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" event={"ID":"819df023-cb2d-4b15-a117-82de9af73281","Type":"ContainerStarted","Data":"534b5c5190d4f10b0697cf340aeee1407a1ee149469156159460b1cf30ce96cd"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.177613 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" event={"ID":"819df023-cb2d-4b15-a117-82de9af73281","Type":"ContainerStarted","Data":"9e3d6a2acdaa94d0d6c5e888d25c6d2f56f30072a2a200f95756a9795bc46472"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.200667 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.200999 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" event={"ID":"cd4ed867-e8fc-46a0-8a90-23b95b55d9da","Type":"ContainerStarted","Data":"bf22caf5c3f4d0e0f165620505602afad3117bd3f40457ca402b70bdca428572"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.219686 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" event={"ID":"f9c953c5-c14d-4655-b632-0fbfa822316c","Type":"ContainerStarted","Data":"a90a82e1f0f175887f5edcb6a0f08984106bebfc40ea96dc3bb0a613982ebcd8"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.228033 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ksh5l" podStartSLOduration=8.228001974 podStartE2EDuration="8.228001974s" podCreationTimestamp="2025-11-26 13:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.226737719 +0000 UTC m=+113.905939537" watchObservedRunningTime="2025-11-26 13:32:25.228001974 +0000 UTC m=+113.907203792" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.239860 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:25 crc kubenswrapper[5200]: 
I1126 13:32:25.257031 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-5sjbk" event={"ID":"1201bf36-1a87-438f-9140-a4c310e347ee","Type":"ContainerStarted","Data":"3745bf0fbd88006249d247f1a2eef3d00936bd70e7f4a9afd65069edcb7ef4d7"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.264089 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.266672 5200 patch_prober.go:28] interesting pod/downloads-747b44746d-5sjbk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.266782 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-5sjbk" podUID="1201bf36-1a87-438f-9140-a4c310e347ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.267212 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.270117 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.770084122 +0000 UTC m=+114.449285940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.277569 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lpc6x" event={"ID":"7ead8b44-73b3-4f27-8b79-65548b623235","Type":"ContainerStarted","Data":"ee49cc65df0dd4579bd9969474ff50f90050ffd58360ffdbe45a1a1411cdd5e6"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.301535 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-mcj4w" podStartSLOduration=23.301503591 podStartE2EDuration="23.301503591s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.27716861 +0000 UTC m=+113.956370438" watchObservedRunningTime="2025-11-26 13:32:25.301503591 +0000 UTC m=+113.980705409" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.303485 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" event={"ID":"0583aa03-5210-45af-a727-9dd88a965f3b","Type":"ContainerStarted","Data":"9cd608490a0fb32e85bcde1b7ea406ed0ff255be44581125b1b4a659784193ce"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.326256 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" event={"ID":"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40","Type":"ContainerStarted","Data":"25cc5aba25162b0c36eb795514fd45b619c7c16062e069a7121619b090f779c9"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.326311 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" event={"ID":"fad4c3f5-8f22-4fed-815e-6e9bb8db7a40","Type":"ContainerStarted","Data":"66a4789cabc85b1f6558768f96faae1f28e7aff903746e3171e516934f7f189a"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.330830 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.334907 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-74545575db-76hbn" podStartSLOduration=23.334865234 podStartE2EDuration="23.334865234s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.330249335 +0000 UTC m=+114.009451153" watchObservedRunningTime="2025-11-26 13:32:25.334865234 +0000 UTC m=+114.014067052" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.356019 5200 patch_prober.go:28] interesting pod/catalog-operator-75ff9f647d-7dcwj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: 
connection refused" start-of-body= Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.356134 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" podUID="fad4c3f5-8f22-4fed-815e-6e9bb8db7a40" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.357940 5200 patch_prober.go:28] interesting pod/apiserver-9ddfb9f55-54qvp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]log ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]etcd ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/generic-apiserver-start-informers ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/max-in-flight-filter ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 26 13:32:25 crc kubenswrapper[5200]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 26 13:32:25 crc kubenswrapper[5200]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/project.openshift.io-projectcache ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/openshift.io-startinformers ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 26 13:32:25 crc kubenswrapper[5200]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 26 13:32:25 crc kubenswrapper[5200]: livez check failed Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.358193 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" podUID="a6346413-47ab-40fc-bf6b-9315694ebaaa" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.367423 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" event={"ID":"260a7f9c-0a16-4a9c-95f9-84ddef0bbf96","Type":"ContainerStarted","Data":"0ddfeb101e7675a09ad41755c19f5b98dfa72283eb83f0c61e2495731c649e3a"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.367566 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.368501 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.374558 5200 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.874534874 +0000 UTC m=+114.553736692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.402304 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-twnv2" event={"ID":"5056ea4a-2f50-4c28-95bf-bd0c21db14ea","Type":"ContainerStarted","Data":"7b671ddce3dd5cbc851cd389abaa144cb9d7c268aa2cbc2d0f2c121d51dc3a9c"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.413526 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-747b44746d-5sjbk" podStartSLOduration=23.413486774 podStartE2EDuration="23.413486774s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.364455382 +0000 UTC m=+114.043657210" watchObservedRunningTime="2025-11-26 13:32:25.413486774 +0000 UTC m=+114.092688592" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.414672 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" event={"ID":"626994d9-9c2a-4eb1-8383-090d3e64b27d","Type":"ContainerStarted","Data":"d1115e66bad3fd8302b8b2a9dde67a52b3897c5295d9a81a1c583c92a4ad1440"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.440380 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" event={"ID":"f3b86d85-2bcb-426a-b891-b3d880ac7991","Type":"ContainerStarted","Data":"846c8ae1b291164d952d8943fe5541ed6301514b5b2271dca4d06cfc64371503"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.458304 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.464265 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" event={"ID":"ee8b88c7-3cb3-495a-a95f-458d98655950","Type":"ContainerStarted","Data":"e268e3eeeec9ea91064cbff3d6a91e8a9821151eeca60414965c27c19bd8b6e1"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.466094 5200 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-s9fc2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.466183 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" podUID="f3b86d85-2bcb-426a-b891-b3d880ac7991" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.471231 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-hs8rn" podStartSLOduration=23.471210469 podStartE2EDuration="23.471210469s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.469940543 +0000 UTC m=+114.149142381" watchObservedRunningTime="2025-11-26 13:32:25.471210469 +0000 UTC m=+114.150412287" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.473563 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-69b85846b6-wn4bx" podStartSLOduration=23.473552334 podStartE2EDuration="23.473552334s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.439509722 +0000 UTC m=+114.118711550" watchObservedRunningTime="2025-11-26 13:32:25.473552334 +0000 UTC m=+114.152754152" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.474520 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.474895 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:25.974865131 +0000 UTC m=+114.654066949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.487087 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" event={"ID":"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b","Type":"ContainerStarted","Data":"19b99eaa5ebfb7580ff843142da87c31eb6a9861df4f2062d34fa63d7cb81d2d"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.498543 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" podStartSLOduration=8.498526653 podStartE2EDuration="8.498526653s" podCreationTimestamp="2025-11-26 13:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.497112003 +0000 UTC m=+114.176313821" watchObservedRunningTime="2025-11-26 13:32:25.498526653 +0000 UTC m=+114.177728471" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.520838 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" event={"ID":"facd7487-680a-45aa-b205-ea98d33f5ac6","Type":"ContainerStarted","Data":"010e738caf410230880f6dd051979461fd111cf8d9c4e7845162b0a1dc45e197"} Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.543548 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" podStartSLOduration=23.543532262 podStartE2EDuration="23.543532262s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.533609714 +0000 UTC m=+114.212811542" watchObservedRunningTime="2025-11-26 13:32:25.543532262 +0000 UTC m=+114.222734080" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.545310 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rnv6n"] Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.567387 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" podStartSLOduration=23.567337288 podStartE2EDuration="23.567337288s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.562144783 +0000 UTC m=+114.241346611" watchObservedRunningTime="2025-11-26 13:32:25.567337288 +0000 UTC m=+114.246539096" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.577653 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " 
pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.579543 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.079512359 +0000 UTC m=+114.758714337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.605303 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" podStartSLOduration=23.605153506 podStartE2EDuration="23.605153506s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.599316983 +0000 UTC m=+114.278518801" watchObservedRunningTime="2025-11-26 13:32:25.605153506 +0000 UTC m=+114.284355334" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.679676 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.680100 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.180052922 +0000 UTC m=+114.859254750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.680748 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.683725 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.183709104 +0000 UTC m=+114.862910922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.784019 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.784238 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.284207866 +0000 UTC m=+114.963409684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.784635 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.785135 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.285123241 +0000 UTC m=+114.964325059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.887859 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.888151 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.388135333 +0000 UTC m=+115.067337151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.975354 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.987369 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:25 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:25 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:25 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.987492 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:25 crc kubenswrapper[5200]: I1126 13:32:25.990260 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:25 crc kubenswrapper[5200]: E1126 13:32:25.990755 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.490741024 +0000 UTC m=+115.169942842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.093473 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.093661 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.593624382 +0000 UTC m=+115.272826200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.093908 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.094569 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.594560429 +0000 UTC m=+115.273762247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.196125 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.196519 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.6964945 +0000 UTC m=+115.375696318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.298365 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.298844 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.798823403 +0000 UTC m=+115.478025211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.400669 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.401343 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:26.90126827 +0000 UTC m=+115.580470088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.503219 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.503680 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.003660834 +0000 UTC m=+115.682862652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.553287 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-76hbn" event={"ID":"1f358c1d-6b4a-4dce-8d90-d30e86ad2e1e","Type":"ContainerStarted","Data":"520a82d018b5a3030fe0b9f56b94e4986980164afdd863af437a80f90808412b"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.564671 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" event={"ID":"160d9bf6-cd99-4000-b1b1-ea717f855b81","Type":"ContainerStarted","Data":"3a903d972b9cfb04ce51bd93747bb34d949e2d8b685e4ebb3bdda3671439c610"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.585874 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" event={"ID":"819df023-cb2d-4b15-a117-82de9af73281","Type":"ContainerStarted","Data":"462c773ec86635faa187d7968984069b75eb5e421cabffb7649c25d13b77bab5"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.586786 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.604221 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" event={"ID":"0583aa03-5210-45af-a727-9dd88a965f3b","Type":"ContainerStarted","Data":"f5c6a398f63637d381be3f927860ed6bcd5e2eaa10894e91a9fafb9e68bbe251"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.605172 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.605464 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.105414851 +0000 UTC m=+115.784616669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.606190 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.606592 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.106581864 +0000 UTC m=+115.785783682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.617108 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" event={"ID":"626994d9-9c2a-4eb1-8383-090d3e64b27d","Type":"ContainerStarted","Data":"372dba666b9b1b2bc0e2a60da9e9ca72d8bf3dec2d0feffceafd8334afba4fb2"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.628034 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-twnv2" podStartSLOduration=9.628021354 podStartE2EDuration="9.628021354s" podCreationTimestamp="2025-11-26 13:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:25.625817314 +0000 UTC m=+114.305019132" watchObservedRunningTime="2025-11-26 13:32:26.628021354 +0000 UTC m=+115.307223172" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.636501 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" event={"ID":"ee8b88c7-3cb3-495a-a95f-458d98655950","Type":"ContainerStarted","Data":"614ced587031243b0695ec1a6083b1e888794fcb22e6f4996b9b5feb2d6d4758"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.639567 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" event={"ID":"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba","Type":"ContainerStarted","Data":"ee6761563670e5c32467eb8782b6d94c2727e899092c5856f2e5dedc6c3f4baa"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.641890 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" 
event={"ID":"10b6ac0f-ebd5-49ff-a2d7-f729135c410e","Type":"ContainerStarted","Data":"fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.641918 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" event={"ID":"10b6ac0f-ebd5-49ff-a2d7-f729135c410e","Type":"ContainerStarted","Data":"fb9b1c2ebaf22a1777f90baca390a0cc513ff626386ff319fa5e28643e0daf3a"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.642698 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.644321 5200 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-gj489 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.644397 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.648783 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" event={"ID":"e175f1f4-4647-4ca1-aecf-baab5eab2da1","Type":"ContainerStarted","Data":"126a8d5e99ee00b931180d41f76d43d1e2c325adc74e5eccd8e7cdbe17f74e69"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.648813 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" event={"ID":"e175f1f4-4647-4ca1-aecf-baab5eab2da1","Type":"ContainerStarted","Data":"b186b5f4b6df0d94d7ed24a1a04f7014ee11fdc38474c359cdca297d92d93884"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.670595 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lpc6x" event={"ID":"7ead8b44-73b3-4f27-8b79-65548b623235","Type":"ContainerStarted","Data":"71974dd8efebbda36054e4c831f7b95a5bc78825ed26459468ee6546f558f1c2"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.671933 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" podStartSLOduration=24.671889431 podStartE2EDuration="24.671889431s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.62895396 +0000 UTC m=+115.308155788" watchObservedRunningTime="2025-11-26 13:32:26.671889431 +0000 UTC m=+115.351091249" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.673137 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-sz67t" podStartSLOduration=24.673130056 podStartE2EDuration="24.673130056s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
13:32:26.670948325 +0000 UTC m=+115.350150143" watchObservedRunningTime="2025-11-26 13:32:26.673130056 +0000 UTC m=+115.352331874" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.673659 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" event={"ID":"ac4ee1c6-2791-40d8-a167-611e3a8b855f","Type":"ContainerStarted","Data":"d53bac9727d5a6d788020b0375169729beecfec7340a43be01c80d25fd1943b3"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.673723 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" event={"ID":"ac4ee1c6-2791-40d8-a167-611e3a8b855f","Type":"ContainerStarted","Data":"f2cd5dc64723e221c7dac4b583ded9850efca4be0418824b6553aa13d6470712"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.707963 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.710448 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.210417449 +0000 UTC m=+115.889619267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.721183 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" event={"ID":"f3b86d85-2bcb-426a-b891-b3d880ac7991","Type":"ContainerStarted","Data":"776073ec4d8908e39855833ea025f2eda0ec76ec1060e0183b63817e5b3a6707"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.731264 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-s9fc2" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.735190 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-jkbqq" podStartSLOduration=24.735179732 podStartE2EDuration="24.735179732s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.735017827 +0000 UTC m=+115.414219655" watchObservedRunningTime="2025-11-26 13:32:26.735179732 +0000 UTC m=+115.414381550" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.742534 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" 
event={"ID":"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b","Type":"ContainerStarted","Data":"59c7ab3a4317aad68d35814b5556ab11f38fcfec9ed49926ea289412ef6f7b29"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.769534 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" event={"ID":"facd7487-680a-45aa-b205-ea98d33f5ac6","Type":"ContainerStarted","Data":"e6fddb4d18f90a573e7ef62aa24b7a98ed6b4cc51afa494c24d26843a49075a6"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.803891 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" event={"ID":"db9aeb0f-11fc-4e41-9ab0-c2a09854fc67","Type":"ContainerStarted","Data":"7de29316a133a88193631eeedaccce211705f8d1bdd953a8b3f478f2684dbecb"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.803941 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" event={"ID":"db9aeb0f-11fc-4e41-9ab0-c2a09854fc67","Type":"ContainerStarted","Data":"97cb3c925ac133760b4c5bb74b66e0ff3b0174eda367421861c88731ccf7183b"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.810139 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.811668 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.311653081 +0000 UTC m=+115.990855099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.831985 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" event={"ID":"06c119f7-e161-458b-9962-5b3593bfffda","Type":"ContainerStarted","Data":"8417ddb35aebe970b7372de7816450712ee858e9c82a1a915ac1e4b0bf7e4c79"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.832042 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" event={"ID":"06c119f7-e161-458b-9962-5b3593bfffda","Type":"ContainerStarted","Data":"e33f6236b775aede80592d068615b7e1e9546fc060bb8a840819757a946d998f"} Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.833323 5200 patch_prober.go:28] interesting pod/downloads-747b44746d-5sjbk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.833387 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-5sjbk" podUID="1201bf36-1a87-438f-9140-a4c310e347ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.841739 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-7dcwj" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.851266 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-lfbjb" podStartSLOduration=24.851251549 podStartE2EDuration="24.851251549s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.808907284 +0000 UTC m=+115.488109102" watchObservedRunningTime="2025-11-26 13:32:26.851251549 +0000 UTC m=+115.530453367" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.852708 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-jtgks" podStartSLOduration=24.85270176 podStartE2EDuration="24.85270176s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.849929602 +0000 UTC m=+115.529131430" watchObservedRunningTime="2025-11-26 13:32:26.85270176 +0000 UTC m=+115.531903568" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.876994 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-rnvc6" podStartSLOduration=24.876953778 
podStartE2EDuration="24.876953778s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.875771775 +0000 UTC m=+115.554973673" watchObservedRunningTime="2025-11-26 13:32:26.876953778 +0000 UTC m=+115.556155586" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.906052 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" podStartSLOduration=24.906014491 podStartE2EDuration="24.906014491s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.904502659 +0000 UTC m=+115.583704477" watchObservedRunningTime="2025-11-26 13:32:26.906014491 +0000 UTC m=+115.585216299" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.912236 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.913120 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.413097229 +0000 UTC m=+116.092299047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.913579 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:26 crc kubenswrapper[5200]: E1126 13:32:26.927746 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.427726499 +0000 UTC m=+116.106928317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.982058 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:26 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:26 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:26 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.982147 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:26 crc kubenswrapper[5200]: I1126 13:32:26.988658 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" podStartSLOduration=24.988617302 podStartE2EDuration="24.988617302s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:26.979905449 +0000 UTC m=+115.659107277" watchObservedRunningTime="2025-11-26 13:32:26.988617302 +0000 UTC m=+115.667819120" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.014641 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.015007 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.51498422 +0000 UTC m=+116.194186038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.076422 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-q77rr" podStartSLOduration=25.076397488 podStartE2EDuration="25.076397488s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:27.074859725 +0000 UTC m=+115.754061543" watchObservedRunningTime="2025-11-26 13:32:27.076397488 +0000 UTC m=+115.755599306" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.093380 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-snhf2" podStartSLOduration=25.093333642 podStartE2EDuration="25.093333642s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:27.047826259 +0000 UTC m=+115.727028097" watchObservedRunningTime="2025-11-26 13:32:27.093333642 +0000 UTC m=+115.772535460" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.107448 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-5777786469-krpxs" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.118022 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.118583 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.618557718 +0000 UTC m=+116.297759716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.197855 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-qrgss" podStartSLOduration=25.197829936 podStartE2EDuration="25.197829936s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:27.195007797 +0000 UTC m=+115.874209615" watchObservedRunningTime="2025-11-26 13:32:27.197829936 +0000 UTC m=+115.877031754" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.221239 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.72120802 +0000 UTC m=+116.400409848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.221101 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.221489 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.222039 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.722014202 +0000 UTC m=+116.401216020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.324771 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.325419 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.825388434 +0000 UTC m=+116.504590252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.426738 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.427137 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:27.927115491 +0000 UTC m=+116.606317499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.528389 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.528602 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.028569509 +0000 UTC m=+116.707771327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.529152 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.529635 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.029616088 +0000 UTC m=+116.708817906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.574962 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djxct"] Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.584265 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.591471 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.609615 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djxct"] Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.637386 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.137350492 +0000 UTC m=+116.816552490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.637443 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.637841 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4w7\" (UniqueName: \"kubernetes.io/projected/08ffe275-d9f0-491b-bc0a-ae428458c956-kube-api-access-7p4w7\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.637934 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-catalog-content\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.638258 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-utilities\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.638496 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.639053 5200 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.13903607 +0000 UTC m=+116.818237888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.711962 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-7s4qq" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.740058 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.740289 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.240248841 +0000 UTC m=+116.919450659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.740390 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4w7\" (UniqueName: \"kubernetes.io/projected/08ffe275-d9f0-491b-bc0a-ae428458c956-kube-api-access-7p4w7\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.740454 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-catalog-content\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.740666 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-utilities\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.741248 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-catalog-content\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.741413 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-utilities\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.788159 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8vhwf"] Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.796385 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.804416 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.806817 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4w7\" (UniqueName: \"kubernetes.io/projected/08ffe275-d9f0-491b-bc0a-ae428458c956-kube-api-access-7p4w7\") pod \"community-operators-djxct\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.807537 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vhwf"] Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.843745 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-catalog-content\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.843791 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-utilities\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.843907 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.844269 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flc5h\" (UniqueName: \"kubernetes.io/projected/65b299e0-e301-40f7-a74c-d7403b784d36-kube-api-access-flc5h\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.844347 5200 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.344288592 +0000 UTC m=+117.023490590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.865266 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lpc6x" event={"ID":"7ead8b44-73b3-4f27-8b79-65548b623235","Type":"ContainerStarted","Data":"17d2acb053af96914d518bef1e7b27effb222e15eb80515662e4ae3ed652b214"} Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.865543 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.868506 5200 generic.go:358] "Generic (PLEG): container finished" podID="55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" containerID="59c7ab3a4317aad68d35814b5556ab11f38fcfec9ed49926ea289412ef6f7b29" exitCode=0 Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.868614 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" event={"ID":"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b","Type":"ContainerDied","Data":"59c7ab3a4317aad68d35814b5556ab11f38fcfec9ed49926ea289412ef6f7b29"} Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.871948 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" event={"ID":"160d9bf6-cd99-4000-b1b1-ea717f855b81","Type":"ContainerStarted","Data":"46c4ca767bafe85eb5b81d384d2f3918c7230d0a3bafbcc16103cc2d1dab1435"} Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.875453 5200 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-gj489 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.875524 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.876188 5200 patch_prober.go:28] interesting pod/downloads-747b44746d-5sjbk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.876251 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-5sjbk" podUID="1201bf36-1a87-438f-9140-a4c310e347ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: 
connect: connection refused" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.876399 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" gracePeriod=30 Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.891874 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lpc6x" podStartSLOduration=10.891847743 podStartE2EDuration="10.891847743s" podCreationTimestamp="2025-11-26 13:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:27.891244406 +0000 UTC m=+116.570446234" watchObservedRunningTime="2025-11-26 13:32:27.891847743 +0000 UTC m=+116.571049571" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.899967 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.921390 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-69db94689b-lmbx9" podStartSLOduration=25.921364828 podStartE2EDuration="25.921364828s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:27.920633278 +0000 UTC m=+116.599835106" watchObservedRunningTime="2025-11-26 13:32:27.921364828 +0000 UTC m=+116.600566646" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.945851 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:27 crc kubenswrapper[5200]: E1126 13:32:27.946971 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.446948654 +0000 UTC m=+117.126150472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.947076 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-catalog-content\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.947125 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-utilities\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.947546 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.947898 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flc5h\" (UniqueName: \"kubernetes.io/projected/65b299e0-e301-40f7-a74c-d7403b784d36-kube-api-access-flc5h\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.950134 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-utilities\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:27 crc kubenswrapper[5200]: I1126 13:32:27.997230 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-catalog-content\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.002005 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:28 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:28 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:28 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.002104 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" 
podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.010292 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.510242755 +0000 UTC m=+117.189444573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.051613 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.052118 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.552091746 +0000 UTC m=+117.231293564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.070737 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fs86m"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.146640 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flc5h\" (UniqueName: \"kubernetes.io/projected/65b299e0-e301-40f7-a74c-d7403b784d36-kube-api-access-flc5h\") pod \"certified-operators-8vhwf\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.148126 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.150362 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fs86m"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.156729 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-catalog-content\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.156808 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-utilities\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.156857 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzw5d\" (UniqueName: \"kubernetes.io/projected/46257d12-db68-4b03-8ba0-d097f5da18ab-kube-api-access-rzw5d\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.156932 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.157360 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.65734166 +0000 UTC m=+117.336543478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.193659 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xskxw"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.207388 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.211532 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xskxw"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.246644 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.259650 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.259898 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-utilities\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.260056 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.760020183 +0000 UTC m=+117.439222001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.260142 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-catalog-content\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.260208 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-catalog-content\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.260283 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-utilities\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.260311 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxl2\" (UniqueName: \"kubernetes.io/projected/527846e5-1964-47c9-97ab-a2a1cb70074a-kube-api-access-xxxl2\") 
pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.260338 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzw5d\" (UniqueName: \"kubernetes.io/projected/46257d12-db68-4b03-8ba0-d097f5da18ab-kube-api-access-rzw5d\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.261104 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-catalog-content\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.261377 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-utilities\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.286414 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.297712 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.302580 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler\"/\"kube-root-ca.crt\"" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.302724 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler\"/\"installer-sa-dockercfg-qpkss\"" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.316839 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzw5d\" (UniqueName: \"kubernetes.io/projected/46257d12-db68-4b03-8ba0-d097f5da18ab-kube-api-access-rzw5d\") pod \"community-operators-fs86m\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.371917 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxl2\" (UniqueName: \"kubernetes.io/projected/527846e5-1964-47c9-97ab-a2a1cb70074a-kube-api-access-xxxl2\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.372358 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-utilities\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.372384 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.372430 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-catalog-content\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.372462 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.372505 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.372981 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-catalog-content\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.373036 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.873014125 +0000 UTC m=+117.552215943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.373290 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-utilities\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.423086 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxl2\" (UniqueName: \"kubernetes.io/projected/527846e5-1964-47c9-97ab-a2a1cb70074a-kube-api-access-xxxl2\") pod \"certified-operators-xskxw\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.434051 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.475484 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.475755 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.475805 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.476143 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:28.976095328 +0000 UTC m=+117.655297146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.476269 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.481961 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.503954 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.537507 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djxct"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.560241 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.577299 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.577802 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.077785664 +0000 UTC m=+117.756987482 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.673976 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.678583 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.679317 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.179272823 +0000 UTC m=+117.858474661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.787240 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.788201 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.288184349 +0000 UTC m=+117.967386167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.874840 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vhwf"] Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.890529 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.891056 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.391036927 +0000 UTC m=+118.070238735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.901891 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerStarted","Data":"6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e"} Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.901941 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerStarted","Data":"88322b41c70e7aba7830c95b94424a0d6724b61215357c1cd9cb00408b1dab7a"} Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.913580 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.989282 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:28 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:28 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:28 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.990833 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:28 crc kubenswrapper[5200]: I1126 13:32:28.992403 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:28 crc kubenswrapper[5200]: E1126 13:32:28.993790 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.493774461 +0000 UTC m=+118.172976279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.093796 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.094225 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.59418257 +0000 UTC m=+118.273384388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.094486 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.094999 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.594981212 +0000 UTC m=+118.274183030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.174329 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.175295 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.182521 5200 patch_prober.go:28] interesting pod/console-64d44f6ddf-vvkzj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.182561 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-vvkzj" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.195668 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fs86m"] Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.196236 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.198287 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.698270272 +0000 UTC m=+118.377472090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.199347 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.232247 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.242786 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-9ddfb9f55-54qvp" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.300630 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.301646 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.801622574 +0000 UTC m=+118.480824392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.318105 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.403515 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-secret-volume\") pod \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.403710 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slrfd\" (UniqueName: \"kubernetes.io/projected/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-kube-api-access-slrfd\") pod \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.403742 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-config-volume\") pod \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\" (UID: \"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b\") " Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.403976 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.404679 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:29.904639366 +0000 UTC m=+118.583841184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.405280 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-config-volume" (OuterVolumeSpecName: "config-volume") pod "55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" (UID: "55d5f3d3-1ccb-4a17-88ba-c76273c02a8b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.405610 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xskxw"] Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.433019 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" (UID: "55d5f3d3-1ccb-4a17-88ba-c76273c02a8b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.455943 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-kube-api-access-slrfd" (OuterVolumeSpecName: "kube-api-access-slrfd") pod "55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" (UID: "55d5f3d3-1ccb-4a17-88ba-c76273c02a8b"). InnerVolumeSpecName "kube-api-access-slrfd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.508793 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.508942 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.508957 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-slrfd\" (UniqueName: \"kubernetes.io/projected/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-kube-api-access-slrfd\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.508966 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.509327 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.009310254 +0000 UTC m=+118.688512072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.609878 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.610528 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.110502796 +0000 UTC m=+118.789704614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.711638 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.712088 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.212071927 +0000 UTC m=+118.891273745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.794061 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t7dzl"] Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.794818 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" containerName="collect-profiles" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.794838 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" containerName="collect-profiles" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.794955 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" containerName="collect-profiles" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.800218 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.805560 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.809548 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7dzl"] Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.814092 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.814741 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.314715879 +0000 UTC m=+118.993917697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.920035 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.923611 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-utilities\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.923774 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvx89\" (UniqueName: \"kubernetes.io/projected/a8c12d33-9b04-4eab-9f47-289246106a6f-kube-api-access-jvx89\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.923878 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-catalog-content\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:29 crc kubenswrapper[5200]: E1126 13:32:29.924371 5200 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.424355677 +0000 UTC m=+119.103557495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.925579 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" event={"ID":"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba","Type":"ContainerStarted","Data":"214f354786c63c05730b3c216f233ac24214b9df5c3a76f75789a56c06e9fdf3"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.931836 5200 generic.go:358] "Generic (PLEG): container finished" podID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerID="6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e" exitCode=0 Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.931931 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerDied","Data":"6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.934447 5200 generic.go:358] "Generic (PLEG): container finished" podID="65b299e0-e301-40f7-a74c-d7403b784d36" containerID="98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770" exitCode=0 Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.934560 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhwf" event={"ID":"65b299e0-e301-40f7-a74c-d7403b784d36","Type":"ContainerDied","Data":"98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.934589 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhwf" event={"ID":"65b299e0-e301-40f7-a74c-d7403b784d36","Type":"ContainerStarted","Data":"66756543a65080fcaedd6021ef3c750ff483d963545925747d12306dcc8f476a"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.936043 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs86m" event={"ID":"46257d12-db68-4b03-8ba0-d097f5da18ab","Type":"ContainerStarted","Data":"e4b66cbcf42fc889a079618bd1995c2a579d63659ff960e1a1f2a46ebd5f4f6f"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.939598 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"bd301e5f-508d-4855-8d8e-7c57fe6f0786","Type":"ContainerStarted","Data":"848752a59ae236bad20c2931be14661d58f5b5cae33331cc0d80ad47b9249a61"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.945061 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" event={"ID":"55d5f3d3-1ccb-4a17-88ba-c76273c02a8b","Type":"ContainerDied","Data":"19b99eaa5ebfb7580ff843142da87c31eb6a9861df4f2062d34fa63d7cb81d2d"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.945091 
5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b99eaa5ebfb7580ff843142da87c31eb6a9861df4f2062d34fa63d7cb81d2d" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.945205 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr" Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.957590 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskxw" event={"ID":"527846e5-1964-47c9-97ab-a2a1cb70074a","Type":"ContainerStarted","Data":"8a261f10996a83491611bf00deff26eb0890d3655925d12b89b12658ddfd97ee"} Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.981725 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:29 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:29 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:29 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:29 crc kubenswrapper[5200]: I1126 13:32:29.981801 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.024649 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.024960 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-utilities\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.025049 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvx89\" (UniqueName: \"kubernetes.io/projected/a8c12d33-9b04-4eab-9f47-289246106a6f-kube-api-access-jvx89\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.025155 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-catalog-content\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.026437 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.526405992 +0000 UTC m=+119.205607810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.027422 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-utilities\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.028463 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-catalog-content\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.064680 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvx89\" (UniqueName: \"kubernetes.io/projected/a8c12d33-9b04-4eab-9f47-289246106a6f-kube-api-access-jvx89\") pod \"redhat-marketplace-t7dzl\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.129582 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.129972 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.629954779 +0000 UTC m=+119.309156597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.166198 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bpb"] Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.171735 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.181170 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bpb"] Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.221278 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.230654 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.230993 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-utilities\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.231034 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45sd\" (UniqueName: \"kubernetes.io/projected/b360c828-8ede-436a-8399-b14caf072e00-kube-api-access-s45sd\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.231150 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-catalog-content\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.231259 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.731242563 +0000 UTC m=+119.410444381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.332334 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-utilities\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.332605 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s45sd\" (UniqueName: \"kubernetes.io/projected/b360c828-8ede-436a-8399-b14caf072e00-kube-api-access-s45sd\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.332701 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.332766 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-catalog-content\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.333294 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-utilities\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.333618 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.833603976 +0000 UTC m=+119.512805784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.336403 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-catalog-content\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.354408 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45sd\" (UniqueName: \"kubernetes.io/projected/b360c828-8ede-436a-8399-b14caf072e00-kube-api-access-s45sd\") pod \"redhat-marketplace-x5bpb\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.433974 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.434349 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:30.934331855 +0000 UTC m=+119.613533673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.485271 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7dzl"] Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.503528 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.546583 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.546953 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 13:32:31.046940225 +0000 UTC m=+119.726142033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.648478 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.649261 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.149237627 +0000 UTC m=+119.828439445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.754728 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.755083 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.255071018 +0000 UTC m=+119.934272836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.776457 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cdfrr"] Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.783923 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.792401 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdfrr"] Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.792897 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.816803 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bpb"] Nov 26 13:32:30 crc kubenswrapper[5200]: W1126 13:32:30.839220 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb360c828_8ede_436a_8399_b14caf072e00.slice/crio-791e6f7190b4015dff19601a292ed06492c85af9c78622957c1eff7d1e57af43 WatchSource:0}: Error finding container 791e6f7190b4015dff19601a292ed06492c85af9c78622957c1eff7d1e57af43: Status 404 returned error can't find the container with id 791e6f7190b4015dff19601a292ed06492c85af9c78622957c1eff7d1e57af43 Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.857162 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.857512 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-catalog-content\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.858342 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.358312857 +0000 UTC m=+120.037514675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.860910 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fszn\" (UniqueName: \"kubernetes.io/projected/aaf837ae-040e-49b2-afff-9c189bbe80e8-kube-api-access-6fszn\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.861090 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.861194 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-utilities\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.861706 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.361672041 +0000 UTC m=+120.040873859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.962722 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.963294 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-catalog-content\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.963344 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fszn\" (UniqueName: \"kubernetes.io/projected/aaf837ae-040e-49b2-afff-9c189bbe80e8-kube-api-access-6fszn\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.963395 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-utilities\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.963821 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-utilities\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.964206 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-catalog-content\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:30 crc kubenswrapper[5200]: E1126 13:32:30.964215 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.464198139 +0000 UTC m=+120.143399957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.970252 5200 generic.go:358] "Generic (PLEG): container finished" podID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerID="7b46d05fd6c07fe6e796d894b0a275bdb9be268d58b2387d970e4e2e556a23af" exitCode=0 Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.976803 5200 generic.go:358] "Generic (PLEG): container finished" podID="bd301e5f-508d-4855-8d8e-7c57fe6f0786" containerID="19469324d278ec2177bdac473c6d8c8af87ae177815efa1b30b9c80d939a705b" exitCode=0 Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.980457 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:30 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:30 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:30 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.980735 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.989620 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs86m" event={"ID":"46257d12-db68-4b03-8ba0-d097f5da18ab","Type":"ContainerDied","Data":"7b46d05fd6c07fe6e796d894b0a275bdb9be268d58b2387d970e4e2e556a23af"} Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.989662 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"bd301e5f-508d-4855-8d8e-7c57fe6f0786","Type":"ContainerDied","Data":"19469324d278ec2177bdac473c6d8c8af87ae177815efa1b30b9c80d939a705b"} Nov 26 13:32:30 crc kubenswrapper[5200]: I1126 13:32:30.989674 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bpb" event={"ID":"b360c828-8ede-436a-8399-b14caf072e00","Type":"ContainerStarted","Data":"791e6f7190b4015dff19601a292ed06492c85af9c78622957c1eff7d1e57af43"} Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.005073 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fszn\" (UniqueName: \"kubernetes.io/projected/aaf837ae-040e-49b2-afff-9c189bbe80e8-kube-api-access-6fszn\") pod \"redhat-operators-cdfrr\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.006597 5200 generic.go:358] "Generic (PLEG): container finished" podID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerID="0c4893b16874fb3bcc59dbc7642d13eefb4346545b775ce9d453c3e38d1c0a01" exitCode=0 Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.006811 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-xskxw" event={"ID":"527846e5-1964-47c9-97ab-a2a1cb70074a","Type":"ContainerDied","Data":"0c4893b16874fb3bcc59dbc7642d13eefb4346545b775ce9d453c3e38d1c0a01"} Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.027111 5200 generic.go:358] "Generic (PLEG): container finished" podID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerID="347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81" exitCode=0 Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.027497 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7dzl" event={"ID":"a8c12d33-9b04-4eab-9f47-289246106a6f","Type":"ContainerDied","Data":"347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81"} Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.027600 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7dzl" event={"ID":"a8c12d33-9b04-4eab-9f47-289246106a6f","Type":"ContainerStarted","Data":"1e0c479495a567c7d9c9b69a933be74573280e99f85b17e26c22e56fcd24d1ac"} Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.064781 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.066071 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.566054319 +0000 UTC m=+120.245256137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.126624 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.168486 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.168618 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.668591818 +0000 UTC m=+120.347793636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.169705 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.170379 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.670370537 +0000 UTC m=+120.349572355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.175523 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bfdtt"] Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.204570 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.208914 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bfdtt"] Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.274073 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.274501 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htgk\" (UniqueName: \"kubernetes.io/projected/68d9317f-7f72-4ac5-babc-bf891defbbd9-kube-api-access-9htgk\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.274560 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-utilities\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.274631 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-catalog-content\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.277484 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.777279748 +0000 UTC m=+120.456481566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.381734 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.382315 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.882291306 +0000 UTC m=+120.561493124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.382472 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9htgk\" (UniqueName: \"kubernetes.io/projected/68d9317f-7f72-4ac5-babc-bf891defbbd9-kube-api-access-9htgk\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.382573 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-utilities\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.382704 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-catalog-content\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.383610 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-utilities\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.397859 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-catalog-content\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.407007 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htgk\" (UniqueName: \"kubernetes.io/projected/68d9317f-7f72-4ac5-babc-bf891defbbd9-kube-api-access-9htgk\") pod \"redhat-operators-bfdtt\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.477039 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdfrr"] Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.485311 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.485497 5200 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.985462763 +0000 UTC m=+120.664664581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.485959 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.486418 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:31.986408619 +0000 UTC m=+120.665610437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.526943 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.587308 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.587476 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.087447006 +0000 UTC m=+120.766648824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.588294 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.594165 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.094142443 +0000 UTC m=+120.773344251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.690248 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.690453 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.190418547 +0000 UTC m=+120.869620365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.690876 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.691270 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.191262861 +0000 UTC m=+120.870464669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.793219 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.793591 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.293555763 +0000 UTC m=+120.972757581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.793891 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.795650 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.295639991 +0000 UTC m=+120.974841809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.855351 5200 patch_prober.go:28] interesting pod/downloads-747b44746d-5sjbk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.855410 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-747b44746d-5sjbk" podUID="1201bf36-1a87-438f-9140-a4c310e347ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.902629 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.902836 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.402806029 +0000 UTC m=+121.082007857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.903253 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:31 crc kubenswrapper[5200]: E1126 13:32:31.903616 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.403603431 +0000 UTC m=+121.082805249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.976793 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.981969 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bfdtt"] Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.984344 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:31 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:31 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:31 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:31 crc kubenswrapper[5200]: I1126 13:32:31.984399 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.004817 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.005004 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 
podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.504972118 +0000 UTC m=+121.184173936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: W1126 13:32:32.005011 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d9317f_7f72_4ac5_babc_bf891defbbd9.slice/crio-a37d48826d3816dfff5eddb654e1948086b4c2642de15eae2fb9eeef1244cbb0 WatchSource:0}: Error finding container a37d48826d3816dfff5eddb654e1948086b4c2642de15eae2fb9eeef1244cbb0: Status 404 returned error can't find the container with id a37d48826d3816dfff5eddb654e1948086b4c2642de15eae2fb9eeef1244cbb0 Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.006093 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.007336 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.507320813 +0000 UTC m=+121.186522631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.038551 5200 generic.go:358] "Generic (PLEG): container finished" podID="b360c828-8ede-436a-8399-b14caf072e00" containerID="30451a5da751385cb27f0f6fa1550a08bc5e0f10ccc2979f9436c8d4b9898b6f" exitCode=0
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.038664 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bpb" event={"ID":"b360c828-8ede-436a-8399-b14caf072e00","Type":"ContainerDied","Data":"30451a5da751385cb27f0f6fa1550a08bc5e0f10ccc2979f9436c8d4b9898b6f"}
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.054650 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfdtt" event={"ID":"68d9317f-7f72-4ac5-babc-bf891defbbd9","Type":"ContainerStarted","Data":"a37d48826d3816dfff5eddb654e1948086b4c2642de15eae2fb9eeef1244cbb0"}
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.061267 5200 generic.go:358] "Generic (PLEG): container finished" podID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerID="3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a" exitCode=0
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.061441 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdfrr" event={"ID":"aaf837ae-040e-49b2-afff-9c189bbe80e8","Type":"ContainerDied","Data":"3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a"}
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.061472 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdfrr" event={"ID":"aaf837ae-040e-49b2-afff-9c189bbe80e8","Type":"ContainerStarted","Data":"9b795e45f35205933c1f175f89da04de009bcf8194e6224a098ed9c380b9bc2f"}
Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.106943 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.109505 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.609485142 +0000 UTC m=+121.288686960 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.209814 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.210189 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.710175339 +0000 UTC m=+121.389377157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.311382 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.311574 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.811547125 +0000 UTC m=+121.490748943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.311964 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.312338 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.812323317 +0000 UTC m=+121.491525135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.351206 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.412927 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.413278 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.913238569 +0000 UTC m=+121.592440387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.413454 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.413988 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:32.913964529 +0000 UTC m=+121.593166347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.515464 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.515648 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kube-api-access\") pod \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.515730 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kubelet-dir\") pod \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\" (UID: \"bd301e5f-508d-4855-8d8e-7c57fe6f0786\") " Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.516213 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd301e5f-508d-4855-8d8e-7c57fe6f0786" (UID: "bd301e5f-508d-4855-8d8e-7c57fe6f0786"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.516300 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2025-11-26 13:32:33.016283292 +0000 UTC m=+121.695485110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.532955 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd301e5f-508d-4855-8d8e-7c57fe6f0786" (UID: "bd301e5f-508d-4855-8d8e-7c57fe6f0786"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.618274 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.618426 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.618441 5200 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd301e5f-508d-4855-8d8e-7c57fe6f0786-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.618774 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.118753359 +0000 UTC m=+121.797955217 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.671271 5200 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.719885 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.720094 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.220054553 +0000 UTC m=+121.899256371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.720372 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.720889 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.220880116 +0000 UTC m=+121.900081924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.825551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.825934 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.325909895 +0000 UTC m=+122.005111713 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.928535 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:32 crc kubenswrapper[5200]: E1126 13:32:32.928886 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.428873355 +0000 UTC m=+122.108075173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.978438 5200 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-cpmj7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 13:32:32 crc kubenswrapper[5200]: [-]has-synced failed: reason withheld Nov 26 13:32:32 crc kubenswrapper[5200]: [+]process-running ok Nov 26 13:32:32 crc kubenswrapper[5200]: healthz check failed Nov 26 13:32:32 crc kubenswrapper[5200]: I1126 13:32:32.978772 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" podUID="5022e6ca-d788-42fc-80ee-c4b3458b22f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.030180 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:33 crc kubenswrapper[5200]: E1126 13:32:33.030432 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.530388195 +0000 UTC m=+122.209590023 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.030583 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.030639 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.030856 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:33 crc kubenswrapper[5200]: E1126 13:32:33.031010 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2025-11-26 13:32:33.530992872 +0000 UTC m=+122.210194690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-k5pvc" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.031092 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.031120 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.032381 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.037877 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.038559 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.045605 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.087483 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" event={"ID":"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba","Type":"ContainerStarted","Data":"ba2528cb3cde1c4612932ed67fac50f5ea3b4a5011db8e6a8ab95ca8509d3eda"} Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.087563 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" 
event={"ID":"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba","Type":"ContainerStarted","Data":"e0a87216bbbc0a185a47994c580c8e186ccf3ad9c789f38d1c8d32cf47797071"} Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.090409 5200 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-26T13:32:32.671310019Z","UUID":"4fc8c676-8d03-4e5b-9741-e8ebdacf09bc","Handler":null,"Name":"","Endpoint":""} Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.093761 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"bd301e5f-508d-4855-8d8e-7c57fe6f0786","Type":"ContainerDied","Data":"848752a59ae236bad20c2931be14661d58f5b5cae33331cc0d80ad47b9249a61"} Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.093805 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="848752a59ae236bad20c2931be14661d58f5b5cae33331cc0d80ad47b9249a61" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.093965 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.100570 5200 generic.go:358] "Generic (PLEG): container finished" podID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerID="a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f" exitCode=0 Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.100723 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfdtt" event={"ID":"68d9317f-7f72-4ac5-babc-bf891defbbd9","Type":"ContainerDied","Data":"a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f"} Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.113981 5200 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.114036 5200 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.134184 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.138771 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". 
PluginName "kubernetes.io/csi", VolumeGIDValue "" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.236545 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.246254 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.246306 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount\"" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.271944 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-k5pvc\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.296002 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.305239 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.311982 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.319263 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.660388 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-k5pvc"] Nov 26 13:32:33 crc kubenswrapper[5200]: W1126 13:32:33.672156 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae19c49_1c67_4da5_b8a0_da6a4fd4f2fa.slice/crio-8133525f49bd2c01485a588810e93109ccd0e3081f2c8fc5a8899887f8fd03d5 WatchSource:0}: Error finding container 8133525f49bd2c01485a588810e93109ccd0e3081f2c8fc5a8899887f8fd03d5: Status 404 returned error can't find the container with id 8133525f49bd2c01485a588810e93109ccd0e3081f2c8fc5a8899887f8fd03d5 Nov 26 13:32:33 crc kubenswrapper[5200]: W1126 13:32:33.808515 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a9ae5f6_97bd_46ac_bafa_ca1b4452a141.slice/crio-27b67c157a3e2e3b9b4097b7b8258a7ba31420f1ac723dfe9e1d9e2f1b6fd969 WatchSource:0}: Error finding container 27b67c157a3e2e3b9b4097b7b8258a7ba31420f1ac723dfe9e1d9e2f1b6fd969: Status 404 returned error can't find the container with id 27b67c157a3e2e3b9b4097b7b8258a7ba31420f1ac723dfe9e1d9e2f1b6fd969 Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.978837 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:33 crc kubenswrapper[5200]: I1126 13:32:33.981973 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68cf44c8b8-cpmj7" Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.111307 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"9e18ba83f34cb19d84221e58564fd19f7a752f3efa5e5fafb7e5c3485831522a"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.111751 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"27b67c157a3e2e3b9b4097b7b8258a7ba31420f1ac723dfe9e1d9e2f1b6fd969"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.117832 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" event={"ID":"1e6e5c1a-c017-4da5-bef8-d21f2c0b8dba","Type":"ContainerStarted","Data":"2c0a429980cdb467ab593d00f545a625a5d3cce975c04d99dafea5f78cd97b4c"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.122945 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" event={"ID":"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa","Type":"ContainerStarted","Data":"a1e44980bc0b4befff3a2cc4569d65ef710ef19fa6c0f5e45c6bf4eecacfb055"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.123021 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" event={"ID":"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa","Type":"ContainerStarted","Data":"8133525f49bd2c01485a588810e93109ccd0e3081f2c8fc5a8899887f8fd03d5"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.124246 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.138561 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"9302d75eb7948e680a9abbf0f72201df242511b68a4559f8afd4250d8833ced1"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.156113 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ctstk" podStartSLOduration=17.15609494 podStartE2EDuration="17.15609494s" podCreationTimestamp="2025-11-26 13:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:34.15501145 +0000 UTC m=+122.834213268" watchObservedRunningTime="2025-11-26 13:32:34.15609494 +0000 UTC m=+122.835296748" Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.167283 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"9518c5849bbee50f0d98bddc977c68a9030653abe825194b76d8a78a3f593658"} Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.178254 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" podStartSLOduration=32.178214239 podStartE2EDuration="32.178214239s" podCreationTimestamp="2025-11-26 13:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:34.176058889 +0000 UTC m=+122.855260717" watchObservedRunningTime="2025-11-26 13:32:34.178214239 +0000 UTC m=+122.857416067" Nov 26 13:32:34 crc kubenswrapper[5200]: I1126 13:32:34.982962 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9b5059-1b3e-4067-a63d-2952cbe863af" path="/var/lib/kubelet/pods/9e9b5059-1b3e-4067-a63d-2952cbe863af/volumes" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.178119 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"f8b209c36ce76877d7712618c66120d25c78e7cf09efe4a9f8528fc249eb53fc"} Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.182173 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"e758bd42c4bca36029f1f9e8fe0af28e3ba5ad51eddff9516aabe20be5ba8458"} Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.200873 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:32:35 crc kubenswrapper[5200]: E1126 13:32:35.215737 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:35 crc kubenswrapper[5200]: E1126 13:32:35.233061 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:35 crc kubenswrapper[5200]: E1126 13:32:35.249508 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:35 crc kubenswrapper[5200]: E1126 13:32:35.249629 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.330125 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.330829 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd301e5f-508d-4855-8d8e-7c57fe6f0786" containerName="pruner" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.330851 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd301e5f-508d-4855-8d8e-7c57fe6f0786" containerName="pruner" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.330965 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd301e5f-508d-4855-8d8e-7c57fe6f0786" containerName="pruner" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.774599 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.774845 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.777948 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.778563 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\"" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.885363 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00ba9854-d4d7-4429-99c8-38322fdd8f61-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.885435 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00ba9854-d4d7-4429-99c8-38322fdd8f61-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.994392 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00ba9854-d4d7-4429-99c8-38322fdd8f61-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.994587 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00ba9854-d4d7-4429-99c8-38322fdd8f61-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:35 crc kubenswrapper[5200]: I1126 13:32:35.994921 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00ba9854-d4d7-4429-99c8-38322fdd8f61-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:36 crc kubenswrapper[5200]: I1126 13:32:36.019769 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00ba9854-d4d7-4429-99c8-38322fdd8f61-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:36 crc kubenswrapper[5200]: I1126 13:32:36.105471 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:37 crc kubenswrapper[5200]: I1126 13:32:37.876289 5200 patch_prober.go:28] interesting pod/downloads-747b44746d-5sjbk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Nov 26 13:32:37 crc kubenswrapper[5200]: I1126 13:32:37.877485 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-5sjbk" podUID="1201bf36-1a87-438f-9140-a4c310e347ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Nov 26 13:32:37 crc kubenswrapper[5200]: I1126 13:32:37.908020 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lpc6x" Nov 26 13:32:39 crc kubenswrapper[5200]: I1126 13:32:39.169518 5200 patch_prober.go:28] interesting pod/console-64d44f6ddf-vvkzj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 26 13:32:39 crc kubenswrapper[5200]: I1126 13:32:39.170039 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-vvkzj" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 26 13:32:43 crc kubenswrapper[5200]: I1126 13:32:43.339560 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:32:45 crc kubenswrapper[5200]: E1126 13:32:45.204439 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:45 crc kubenswrapper[5200]: E1126 13:32:45.209106 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:45 crc kubenswrapper[5200]: E1126 13:32:45.211251 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:45 crc kubenswrapper[5200]: E1126 13:32:45.211516 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Nov 26 13:32:47 crc kubenswrapper[5200]: I1126 13:32:47.879596 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-747b44746d-5sjbk" Nov 26 13:32:49 crc kubenswrapper[5200]: I1126 13:32:49.491789 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:49 crc kubenswrapper[5200]: I1126 13:32:49.499131 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:32:55 crc kubenswrapper[5200]: E1126 13:32:55.204939 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:55 crc kubenswrapper[5200]: E1126 13:32:55.209242 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:55 crc kubenswrapper[5200]: E1126 13:32:55.211198 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 13:32:55 crc kubenswrapper[5200]: E1126 13:32:55.211342 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Nov 26 13:32:56 crc kubenswrapper[5200]: I1126 13:32:56.195886 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:32:56 crc kubenswrapper[5200]: I1126 13:32:56.601074 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Nov 26 13:32:56 crc kubenswrapper[5200]: W1126 13:32:56.606159 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod00ba9854_d4d7_4429_99c8_38322fdd8f61.slice/crio-980b9bf7211a58e1f092d84e0b77407aeba605b62a9e9ccc1aa79700d90d8c74 WatchSource:0}: Error finding container 980b9bf7211a58e1f092d84e0b77407aeba605b62a9e9ccc1aa79700d90d8c74: Status 404 returned error can't find the container with id 980b9bf7211a58e1f092d84e0b77407aeba605b62a9e9ccc1aa79700d90d8c74 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.339711 5200 generic.go:358] "Generic (PLEG): container finished" podID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerID="5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.339807 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerDied","Data":"5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.342087 5200 generic.go:358] 
"Generic (PLEG): container finished" podID="65b299e0-e301-40f7-a74c-d7403b784d36" containerID="abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.342197 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhwf" event={"ID":"65b299e0-e301-40f7-a74c-d7403b784d36","Type":"ContainerDied","Data":"abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.345195 5200 generic.go:358] "Generic (PLEG): container finished" podID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerID="339fde4cb529ce6cb50086584205f864510acc9458e186b3e46f48227b91ff5f" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.345465 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs86m" event={"ID":"46257d12-db68-4b03-8ba0-d097f5da18ab","Type":"ContainerDied","Data":"339fde4cb529ce6cb50086584205f864510acc9458e186b3e46f48227b91ff5f"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.347617 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"00ba9854-d4d7-4429-99c8-38322fdd8f61","Type":"ContainerStarted","Data":"f5092a1ac593702117f7617baa21c70ab5158c121237bcad942bff5dc32c11df"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.347652 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"00ba9854-d4d7-4429-99c8-38322fdd8f61","Type":"ContainerStarted","Data":"980b9bf7211a58e1f092d84e0b77407aeba605b62a9e9ccc1aa79700d90d8c74"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.349240 5200 generic.go:358] "Generic (PLEG): container finished" podID="b360c828-8ede-436a-8399-b14caf072e00" containerID="4a696494b883ae785d71c6c3cfc00da08a1bfaf86b1b8fdc441694fc46f119d0" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.349403 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bpb" event={"ID":"b360c828-8ede-436a-8399-b14caf072e00","Type":"ContainerDied","Data":"4a696494b883ae785d71c6c3cfc00da08a1bfaf86b1b8fdc441694fc46f119d0"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.353050 5200 generic.go:358] "Generic (PLEG): container finished" podID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerID="45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.353166 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfdtt" event={"ID":"68d9317f-7f72-4ac5-babc-bf891defbbd9","Type":"ContainerDied","Data":"45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.356808 5200 generic.go:358] "Generic (PLEG): container finished" podID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerID="72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.357006 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdfrr" event={"ID":"aaf837ae-040e-49b2-afff-9c189bbe80e8","Type":"ContainerDied","Data":"72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.360394 5200 generic.go:358] "Generic (PLEG): container 
finished" podID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerID="545445d95788daa57a7ba08572d90ef60785649b98053e45c5b18683e83d2926" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.361017 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskxw" event={"ID":"527846e5-1964-47c9-97ab-a2a1cb70074a","Type":"ContainerDied","Data":"545445d95788daa57a7ba08572d90ef60785649b98053e45c5b18683e83d2926"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.378274 5200 generic.go:358] "Generic (PLEG): container finished" podID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerID="446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6" exitCode=0 Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.378393 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7dzl" event={"ID":"a8c12d33-9b04-4eab-9f47-289246106a6f","Type":"ContainerDied","Data":"446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6"} Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.394163 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-11-crc" podStartSLOduration=22.394145176 podStartE2EDuration="22.394145176s" podCreationTimestamp="2025-11-26 13:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:32:57.391886005 +0000 UTC m=+146.071087833" watchObservedRunningTime="2025-11-26 13:32:57.394145176 +0000 UTC m=+146.073346994" Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.978447 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rnv6n_f5a6b53d-f6d4-47c3-96e6-a20b6fec1686/kube-multus-additional-cni-plugins/0.log" Nov 26 13:32:57 crc kubenswrapper[5200]: I1126 13:32:57.978811 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.088740 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-cni-sysctl-allowlist\") pod \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.088808 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wpt4\" (UniqueName: \"kubernetes.io/projected/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-kube-api-access-5wpt4\") pod \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.088910 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-ready\") pod \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.089034 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-tuning-conf-dir\") pod \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\" (UID: \"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686\") " Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.089256 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" (UID: "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.090242 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" (UID: "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.090431 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-ready" (OuterVolumeSpecName: "ready") pod "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" (UID: "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.108164 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-kube-api-access-5wpt4" (OuterVolumeSpecName: "kube-api-access-5wpt4") pod "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" (UID: "f5a6b53d-f6d4-47c3-96e6-a20b6fec1686"). InnerVolumeSpecName "kube-api-access-5wpt4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.190566 5200 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.190626 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5wpt4\" (UniqueName: \"kubernetes.io/projected/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-kube-api-access-5wpt4\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.190639 5200 reconciler_common.go:299] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-ready\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.190652 5200 reconciler_common.go:299] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.385124 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdfrr" event={"ID":"aaf837ae-040e-49b2-afff-9c189bbe80e8","Type":"ContainerStarted","Data":"d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.387338 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskxw" event={"ID":"527846e5-1964-47c9-97ab-a2a1cb70074a","Type":"ContainerStarted","Data":"34356272cb6a1a1d716568a895df5ecc8edfe779ae69040fb5230677815eb8e5"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.389290 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7dzl" event={"ID":"a8c12d33-9b04-4eab-9f47-289246106a6f","Type":"ContainerStarted","Data":"ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.391534 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerStarted","Data":"8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.393703 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhwf" event={"ID":"65b299e0-e301-40f7-a74c-d7403b784d36","Type":"ContainerStarted","Data":"828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.395233 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rnv6n_f5a6b53d-f6d4-47c3-96e6-a20b6fec1686/kube-multus-additional-cni-plugins/0.log" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.395282 5200 generic.go:358] "Generic (PLEG): container finished" podID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" exitCode=137 Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.395378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" 
event={"ID":"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686","Type":"ContainerDied","Data":"281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.395418 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" event={"ID":"f5a6b53d-f6d4-47c3-96e6-a20b6fec1686","Type":"ContainerDied","Data":"37c444c31037429a6d335cc2ad1e4fcc2499a842a17237671cc3b4f40bdc444c"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.395378 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rnv6n" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.395433 5200 scope.go:117] "RemoveContainer" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.417179 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs86m" event={"ID":"46257d12-db68-4b03-8ba0-d097f5da18ab","Type":"ContainerStarted","Data":"8a82977cd8792981b3d03563fc68949e2dffd3090407b4e7342353f48cf8b1a5"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.420166 5200 generic.go:358] "Generic (PLEG): container finished" podID="00ba9854-d4d7-4429-99c8-38322fdd8f61" containerID="f5092a1ac593702117f7617baa21c70ab5158c121237bcad942bff5dc32c11df" exitCode=0 Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.420238 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"00ba9854-d4d7-4429-99c8-38322fdd8f61","Type":"ContainerDied","Data":"f5092a1ac593702117f7617baa21c70ab5158c121237bcad942bff5dc32c11df"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.428944 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bpb" event={"ID":"b360c828-8ede-436a-8399-b14caf072e00","Type":"ContainerStarted","Data":"578c5fe47ba948654d01cab81525d7e0686ba814ee276265afff07b94c6ea598"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.433013 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cdfrr" podStartSLOduration=4.35461226 podStartE2EDuration="28.43299858s" podCreationTimestamp="2025-11-26 13:32:30 +0000 UTC" firstStartedPulling="2025-11-26 13:32:32.0622368 +0000 UTC m=+120.741438618" lastFinishedPulling="2025-11-26 13:32:56.14062311 +0000 UTC m=+144.819824938" observedRunningTime="2025-11-26 13:32:58.407136272 +0000 UTC m=+147.086338100" watchObservedRunningTime="2025-11-26 13:32:58.43299858 +0000 UTC m=+147.112200398" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.434332 5200 scope.go:117] "RemoveContainer" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.434859 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.435007 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:32:58 crc kubenswrapper[5200]: E1126 13:32:58.436645 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43\": container with ID starting with 
281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43 not found: ID does not exist" containerID="281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.436699 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43"} err="failed to get container status \"281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43\": rpc error: code = NotFound desc = could not find container \"281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43\": container with ID starting with 281b48ad80afd511e1b463eb8479213e564ee6cb2e18a0494ae2fec518dffc43 not found: ID does not exist" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.453563 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xskxw" podStartSLOduration=5.354635209 podStartE2EDuration="30.453532141s" podCreationTimestamp="2025-11-26 13:32:28 +0000 UTC" firstStartedPulling="2025-11-26 13:32:31.007971034 +0000 UTC m=+119.687172842" lastFinishedPulling="2025-11-26 13:32:56.106867956 +0000 UTC m=+144.786069774" observedRunningTime="2025-11-26 13:32:58.434456905 +0000 UTC m=+147.113658743" watchObservedRunningTime="2025-11-26 13:32:58.453532141 +0000 UTC m=+147.132733959" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.454047 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfdtt" event={"ID":"68d9317f-7f72-4ac5-babc-bf891defbbd9","Type":"ContainerStarted","Data":"b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a"} Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.455896 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8vhwf" podStartSLOduration=5.328222978 podStartE2EDuration="31.455886164s" podCreationTimestamp="2025-11-26 13:32:27 +0000 UTC" firstStartedPulling="2025-11-26 13:32:29.936162707 +0000 UTC m=+118.615364525" lastFinishedPulling="2025-11-26 13:32:56.063825893 +0000 UTC m=+144.743027711" observedRunningTime="2025-11-26 13:32:58.453407467 +0000 UTC m=+147.132609285" watchObservedRunningTime="2025-11-26 13:32:58.455886164 +0000 UTC m=+147.135087992" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.473530 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t7dzl" podStartSLOduration=4.438495427 podStartE2EDuration="29.473505144s" podCreationTimestamp="2025-11-26 13:32:29 +0000 UTC" firstStartedPulling="2025-11-26 13:32:31.028863148 +0000 UTC m=+119.708064966" lastFinishedPulling="2025-11-26 13:32:56.063872865 +0000 UTC m=+144.743074683" observedRunningTime="2025-11-26 13:32:58.471357467 +0000 UTC m=+147.150559295" watchObservedRunningTime="2025-11-26 13:32:58.473505144 +0000 UTC m=+147.152706962" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.483536 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.483614 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.506570 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-djxct" podStartSLOduration=5.33536253 podStartE2EDuration="31.506551735s" podCreationTimestamp="2025-11-26 13:32:27 +0000 UTC" firstStartedPulling="2025-11-26 13:32:29.933094661 +0000 UTC m=+118.612296479" lastFinishedPulling="2025-11-26 13:32:56.104283866 +0000 UTC m=+144.783485684" observedRunningTime="2025-11-26 13:32:58.502108517 +0000 UTC m=+147.181310355" watchObservedRunningTime="2025-11-26 13:32:58.506551735 +0000 UTC m=+147.185753563" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.523595 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rnv6n"] Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.534842 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rnv6n"] Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.550782 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fs86m" podStartSLOduration=6.417222248 podStartE2EDuration="31.550760755s" podCreationTimestamp="2025-11-26 13:32:27 +0000 UTC" firstStartedPulling="2025-11-26 13:32:30.971900105 +0000 UTC m=+119.651101933" lastFinishedPulling="2025-11-26 13:32:56.105438622 +0000 UTC m=+144.784640440" observedRunningTime="2025-11-26 13:32:58.546544854 +0000 UTC m=+147.225746682" watchObservedRunningTime="2025-11-26 13:32:58.550760755 +0000 UTC m=+147.229962583" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.562169 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.562380 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.573199 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x5bpb" podStartSLOduration=4.549134854 podStartE2EDuration="28.573180705s" podCreationTimestamp="2025-11-26 13:32:30 +0000 UTC" firstStartedPulling="2025-11-26 13:32:32.039785942 +0000 UTC m=+120.718987760" lastFinishedPulling="2025-11-26 13:32:56.063831793 +0000 UTC m=+144.743033611" observedRunningTime="2025-11-26 13:32:58.56948337 +0000 UTC m=+147.248685188" watchObservedRunningTime="2025-11-26 13:32:58.573180705 +0000 UTC m=+147.252382523" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.593394 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bfdtt" podStartSLOduration=4.590839889 podStartE2EDuration="27.593375225s" podCreationTimestamp="2025-11-26 13:32:31 +0000 UTC" firstStartedPulling="2025-11-26 13:32:33.102343919 +0000 UTC m=+121.781545737" lastFinishedPulling="2025-11-26 13:32:56.104879255 +0000 UTC m=+144.784081073" observedRunningTime="2025-11-26 13:32:58.589448653 +0000 UTC m=+147.268650471" watchObservedRunningTime="2025-11-26 13:32:58.593375225 +0000 UTC m=+147.272577043" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.913811 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-mvlbc" Nov 26 13:32:58 crc kubenswrapper[5200]: I1126 13:32:58.994854 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" 
path="/var/lib/kubelet/pods/f5a6b53d-f6d4-47c3-96e6-a20b6fec1686/volumes" Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.615608 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xskxw" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="registry-server" probeResult="failure" output=< Nov 26 13:32:59 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 13:32:59 crc kubenswrapper[5200]: > Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.616065 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fs86m" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="registry-server" probeResult="failure" output=< Nov 26 13:32:59 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 13:32:59 crc kubenswrapper[5200]: > Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.624992 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8vhwf" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="registry-server" probeResult="failure" output=< Nov 26 13:32:59 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 13:32:59 crc kubenswrapper[5200]: > Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.728486 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.817640 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00ba9854-d4d7-4429-99c8-38322fdd8f61-kubelet-dir\") pod \"00ba9854-d4d7-4429-99c8-38322fdd8f61\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.817761 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00ba9854-d4d7-4429-99c8-38322fdd8f61-kube-api-access\") pod \"00ba9854-d4d7-4429-99c8-38322fdd8f61\" (UID: \"00ba9854-d4d7-4429-99c8-38322fdd8f61\") " Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.819133 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ba9854-d4d7-4429-99c8-38322fdd8f61-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "00ba9854-d4d7-4429-99c8-38322fdd8f61" (UID: "00ba9854-d4d7-4429-99c8-38322fdd8f61"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.829233 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ba9854-d4d7-4429-99c8-38322fdd8f61-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "00ba9854-d4d7-4429-99c8-38322fdd8f61" (UID: "00ba9854-d4d7-4429-99c8-38322fdd8f61"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.919016 5200 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00ba9854-d4d7-4429-99c8-38322fdd8f61-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:32:59 crc kubenswrapper[5200]: I1126 13:32:59.919068 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00ba9854-d4d7-4429-99c8-38322fdd8f61-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.221821 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.221905 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.276266 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.469627 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"00ba9854-d4d7-4429-99c8-38322fdd8f61","Type":"ContainerDied","Data":"980b9bf7211a58e1f092d84e0b77407aeba605b62a9e9ccc1aa79700d90d8c74"} Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.469706 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980b9bf7211a58e1f092d84e0b77407aeba605b62a9e9ccc1aa79700d90d8c74" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.471537 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.504087 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.504173 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:33:00 crc kubenswrapper[5200]: I1126 13:33:00.547492 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:33:01 crc kubenswrapper[5200]: I1126 13:33:01.127225 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:33:01 crc kubenswrapper[5200]: I1126 13:33:01.127882 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:33:01 crc kubenswrapper[5200]: I1126 13:33:01.528234 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:33:01 crc kubenswrapper[5200]: I1126 13:33:01.528845 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:33:02 crc kubenswrapper[5200]: I1126 13:33:02.183519 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cdfrr" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="registry-server" probeResult="failure" output=< Nov 26 13:33:02 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 13:33:02 crc kubenswrapper[5200]: > Nov 26 13:33:02 crc kubenswrapper[5200]: I1126 13:33:02.579188 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bfdtt" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="registry-server" probeResult="failure" output=< Nov 26 13:33:02 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 13:33:02 crc kubenswrapper[5200]: > Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.739548 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.741290 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.741311 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.741360 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00ba9854-d4d7-4429-99c8-38322fdd8f61" containerName="pruner" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.741369 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ba9854-d4d7-4429-99c8-38322fdd8f61" containerName="pruner" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.741483 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5a6b53d-f6d4-47c3-96e6-a20b6fec1686" containerName="kube-multus-additional-cni-plugins" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.741498 5200 
memory_manager.go:356] "RemoveStaleState removing state" podUID="00ba9854-d4d7-4429-99c8-38322fdd8f61" containerName="pruner" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.748023 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.748221 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.751400 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\"" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.751717 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.841300 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.841599 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.943753 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.944291 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.943890 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:06 crc kubenswrapper[5200]: I1126 13:33:06.966917 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:07 crc kubenswrapper[5200]: I1126 13:33:07.064861 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:07 crc kubenswrapper[5200]: I1126 13:33:07.200421 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fhkjl" Nov 26 13:33:07 crc kubenswrapper[5200]: I1126 13:33:07.531262 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Nov 26 13:33:07 crc kubenswrapper[5200]: I1126 13:33:07.900780 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:33:07 crc kubenswrapper[5200]: I1126 13:33:07.900884 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:33:07 crc kubenswrapper[5200]: I1126 13:33:07.974695 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.474653 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.514865 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.522040 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.526828 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"d63ff5b9-e50a-4c5b-9990-c5e32febac22","Type":"ContainerStarted","Data":"7e56133133fadf0f7fe9048be91664695bd020a32df8884de2460924283f6ec3"} Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.526873 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"d63ff5b9-e50a-4c5b-9990-c5e32febac22","Type":"ContainerStarted","Data":"f7696e6fa26c801ae41ddd6eec0580510bf47caf3f1ffd54d112d659e6008fd6"} Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.554487 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-12-crc" podStartSLOduration=2.554467152 podStartE2EDuration="2.554467152s" podCreationTimestamp="2025-11-26 13:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:33:08.551669944 +0000 UTC m=+157.230871762" watchObservedRunningTime="2025-11-26 13:33:08.554467152 +0000 UTC m=+157.233668970" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.568082 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.610971 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.643813 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:33:08 crc kubenswrapper[5200]: I1126 13:33:08.684450 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:33:09 crc kubenswrapper[5200]: I1126 13:33:09.534181 5200 generic.go:358] "Generic (PLEG): container finished" podID="d63ff5b9-e50a-4c5b-9990-c5e32febac22" containerID="7e56133133fadf0f7fe9048be91664695bd020a32df8884de2460924283f6ec3" exitCode=0 Nov 26 13:33:09 crc kubenswrapper[5200]: I1126 13:33:09.534782 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"d63ff5b9-e50a-4c5b-9990-c5e32febac22","Type":"ContainerDied","Data":"7e56133133fadf0f7fe9048be91664695bd020a32df8884de2460924283f6ec3"} Nov 26 13:33:09 crc kubenswrapper[5200]: I1126 13:33:09.801260 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xskxw"] Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.541966 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xskxw" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="registry-server" containerID="cri-o://34356272cb6a1a1d716568a895df5ecc8edfe779ae69040fb5230677815eb8e5" gracePeriod=2 Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.802883 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fs86m"] Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.803168 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fs86m" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="registry-server" containerID="cri-o://8a82977cd8792981b3d03563fc68949e2dffd3090407b4e7342353f48cf8b1a5" gracePeriod=2 Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.823850 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.916296 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kube-api-access\") pod \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.916436 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kubelet-dir\") pod \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\" (UID: \"d63ff5b9-e50a-4c5b-9990-c5e32febac22\") " Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.916599 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d63ff5b9-e50a-4c5b-9990-c5e32febac22" (UID: "d63ff5b9-e50a-4c5b-9990-c5e32febac22"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.917217 5200 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:10 crc kubenswrapper[5200]: I1126 13:33:10.922870 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d63ff5b9-e50a-4c5b-9990-c5e32febac22" (UID: "d63ff5b9-e50a-4c5b-9990-c5e32febac22"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.019177 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d63ff5b9-e50a-4c5b-9990-c5e32febac22-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.197061 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.237838 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.520413 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.524448 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.553901 5200 generic.go:358] "Generic (PLEG): container finished" podID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerID="34356272cb6a1a1d716568a895df5ecc8edfe779ae69040fb5230677815eb8e5" exitCode=0 Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.554035 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskxw" event={"ID":"527846e5-1964-47c9-97ab-a2a1cb70074a","Type":"ContainerDied","Data":"34356272cb6a1a1d716568a895df5ecc8edfe779ae69040fb5230677815eb8e5"} Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.556178 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"d63ff5b9-e50a-4c5b-9990-c5e32febac22","Type":"ContainerDied","Data":"f7696e6fa26c801ae41ddd6eec0580510bf47caf3f1ffd54d112d659e6008fd6"} Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.556338 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7696e6fa26c801ae41ddd6eec0580510bf47caf3f1ffd54d112d659e6008fd6" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.556636 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.588140 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:33:11 crc kubenswrapper[5200]: I1126 13:33:11.632862 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.251990 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.336383 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-utilities\") pod \"527846e5-1964-47c9-97ab-a2a1cb70074a\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.336487 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxl2\" (UniqueName: \"kubernetes.io/projected/527846e5-1964-47c9-97ab-a2a1cb70074a-kube-api-access-xxxl2\") pod \"527846e5-1964-47c9-97ab-a2a1cb70074a\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.336673 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-catalog-content\") pod \"527846e5-1964-47c9-97ab-a2a1cb70074a\" (UID: \"527846e5-1964-47c9-97ab-a2a1cb70074a\") " Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.337724 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-utilities" (OuterVolumeSpecName: "utilities") pod "527846e5-1964-47c9-97ab-a2a1cb70074a" (UID: "527846e5-1964-47c9-97ab-a2a1cb70074a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.348834 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527846e5-1964-47c9-97ab-a2a1cb70074a-kube-api-access-xxxl2" (OuterVolumeSpecName: "kube-api-access-xxxl2") pod "527846e5-1964-47c9-97ab-a2a1cb70074a" (UID: "527846e5-1964-47c9-97ab-a2a1cb70074a"). InnerVolumeSpecName "kube-api-access-xxxl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.359983 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "527846e5-1964-47c9-97ab-a2a1cb70074a" (UID: "527846e5-1964-47c9-97ab-a2a1cb70074a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.437934 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.437973 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527846e5-1964-47c9-97ab-a2a1cb70074a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.437987 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxxl2\" (UniqueName: \"kubernetes.io/projected/527846e5-1964-47c9-97ab-a2a1cb70074a-kube-api-access-xxxl2\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.565654 5200 generic.go:358] "Generic (PLEG): container finished" podID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerID="8a82977cd8792981b3d03563fc68949e2dffd3090407b4e7342353f48cf8b1a5" exitCode=0 Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.565741 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs86m" event={"ID":"46257d12-db68-4b03-8ba0-d097f5da18ab","Type":"ContainerDied","Data":"8a82977cd8792981b3d03563fc68949e2dffd3090407b4e7342353f48cf8b1a5"} Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.565799 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs86m" event={"ID":"46257d12-db68-4b03-8ba0-d097f5da18ab","Type":"ContainerDied","Data":"e4b66cbcf42fc889a079618bd1995c2a579d63659ff960e1a1f2a46ebd5f4f6f"} Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.565814 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4b66cbcf42fc889a079618bd1995c2a579d63659ff960e1a1f2a46ebd5f4f6f" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.566900 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.569564 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xskxw" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.570020 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xskxw" event={"ID":"527846e5-1964-47c9-97ab-a2a1cb70074a","Type":"ContainerDied","Data":"8a261f10996a83491611bf00deff26eb0890d3655925d12b89b12658ddfd97ee"} Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.570119 5200 scope.go:117] "RemoveContainer" containerID="34356272cb6a1a1d716568a895df5ecc8edfe779ae69040fb5230677815eb8e5" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.607085 5200 scope.go:117] "RemoveContainer" containerID="545445d95788daa57a7ba08572d90ef60785649b98053e45c5b18683e83d2926" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.609670 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xskxw"] Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.612509 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xskxw"] Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.630193 5200 scope.go:117] "RemoveContainer" containerID="0c4893b16874fb3bcc59dbc7642d13eefb4346545b775ce9d453c3e38d1c0a01" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.640311 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-catalog-content\") pod \"46257d12-db68-4b03-8ba0-d097f5da18ab\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.640369 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-utilities\") pod \"46257d12-db68-4b03-8ba0-d097f5da18ab\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.640416 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzw5d\" (UniqueName: \"kubernetes.io/projected/46257d12-db68-4b03-8ba0-d097f5da18ab-kube-api-access-rzw5d\") pod \"46257d12-db68-4b03-8ba0-d097f5da18ab\" (UID: \"46257d12-db68-4b03-8ba0-d097f5da18ab\") " Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.642293 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-utilities" (OuterVolumeSpecName: "utilities") pod "46257d12-db68-4b03-8ba0-d097f5da18ab" (UID: "46257d12-db68-4b03-8ba0-d097f5da18ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.645552 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46257d12-db68-4b03-8ba0-d097f5da18ab-kube-api-access-rzw5d" (OuterVolumeSpecName: "kube-api-access-rzw5d") pod "46257d12-db68-4b03-8ba0-d097f5da18ab" (UID: "46257d12-db68-4b03-8ba0-d097f5da18ab"). InnerVolumeSpecName "kube-api-access-rzw5d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.703117 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46257d12-db68-4b03-8ba0-d097f5da18ab" (UID: "46257d12-db68-4b03-8ba0-d097f5da18ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.741600 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.741641 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46257d12-db68-4b03-8ba0-d097f5da18ab-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.741660 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzw5d\" (UniqueName: \"kubernetes.io/projected/46257d12-db68-4b03-8ba0-d097f5da18ab-kube-api-access-rzw5d\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.930448 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-12-crc"] Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931034 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="extract-utilities" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931047 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="extract-utilities" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931058 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d63ff5b9-e50a-4c5b-9990-c5e32febac22" containerName="pruner" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931064 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63ff5b9-e50a-4c5b-9990-c5e32febac22" containerName="pruner" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931075 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="extract-utilities" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931080 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="extract-utilities" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931093 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="registry-server" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931099 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="registry-server" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931108 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="extract-content" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931113 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="extract-content" Nov 26 13:33:12 crc 
kubenswrapper[5200]: I1126 13:33:12.931125 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="extract-content" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931130 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="extract-content" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931151 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="registry-server" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931156 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="registry-server" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931254 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" containerName="registry-server" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931267 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" containerName="registry-server" Nov 26 13:33:12 crc kubenswrapper[5200]: I1126 13:33:12.931277 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d63ff5b9-e50a-4c5b-9990-c5e32febac22" containerName="pruner" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.834334 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fs86m" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.834784 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.838905 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\"" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.839145 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.841797 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527846e5-1964-47c9-97ab-a2a1cb70074a" path="/var/lib/kubelet/pods/527846e5-1964-47c9-97ab-a2a1cb70074a/volumes" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.842664 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"] Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.859097 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-var-lock\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.859271 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af2aea9e-f4a1-4558-812f-eab1f974436d-kube-api-access\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.859316 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-kubelet-dir\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.885734 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fs86m"] Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.891048 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fs86m"] Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.960404 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af2aea9e-f4a1-4558-812f-eab1f974436d-kube-api-access\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.960473 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-kubelet-dir\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.960519 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-var-lock\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.960615 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-var-lock\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.960753 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-kubelet-dir\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:13 crc kubenswrapper[5200]: I1126 13:33:13.980786 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af2aea9e-f4a1-4558-812f-eab1f974436d-kube-api-access\") pod \"installer-12-crc\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.150664 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.358421 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"] Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.587250 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"af2aea9e-f4a1-4558-812f-eab1f974436d","Type":"ContainerStarted","Data":"2d9d1a46a2131f88cd2c413b533a2f0aec5c1ca79d9b842b6039944df28399cd"} Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.607997 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bfdtt"] Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.608805 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bfdtt" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="registry-server" containerID="cri-o://b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a" gracePeriod=2 Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.936515 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.976791 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46257d12-db68-4b03-8ba0-d097f5da18ab" path="/var/lib/kubelet/pods/46257d12-db68-4b03-8ba0-d097f5da18ab/volumes" Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.980604 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-catalog-content\") pod \"68d9317f-7f72-4ac5-babc-bf891defbbd9\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.990551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9htgk\" (UniqueName: \"kubernetes.io/projected/68d9317f-7f72-4ac5-babc-bf891defbbd9-kube-api-access-9htgk\") pod \"68d9317f-7f72-4ac5-babc-bf891defbbd9\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.990626 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-utilities\") pod \"68d9317f-7f72-4ac5-babc-bf891defbbd9\" (UID: \"68d9317f-7f72-4ac5-babc-bf891defbbd9\") " Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.994239 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-utilities" (OuterVolumeSpecName: "utilities") pod "68d9317f-7f72-4ac5-babc-bf891defbbd9" (UID: "68d9317f-7f72-4ac5-babc-bf891defbbd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:14 crc kubenswrapper[5200]: I1126 13:33:14.998884 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d9317f-7f72-4ac5-babc-bf891defbbd9-kube-api-access-9htgk" (OuterVolumeSpecName: "kube-api-access-9htgk") pod "68d9317f-7f72-4ac5-babc-bf891defbbd9" (UID: "68d9317f-7f72-4ac5-babc-bf891defbbd9"). InnerVolumeSpecName "kube-api-access-9htgk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.066612 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68d9317f-7f72-4ac5-babc-bf891defbbd9" (UID: "68d9317f-7f72-4ac5-babc-bf891defbbd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.092221 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.092267 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9htgk\" (UniqueName: \"kubernetes.io/projected/68d9317f-7f72-4ac5-babc-bf891defbbd9-kube-api-access-9htgk\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.092278 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d9317f-7f72-4ac5-babc-bf891defbbd9-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.204888 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bpb"] Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.205541 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x5bpb" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="registry-server" containerID="cri-o://578c5fe47ba948654d01cab81525d7e0686ba814ee276265afff07b94c6ea598" gracePeriod=2 Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.595275 5200 generic.go:358] "Generic (PLEG): container finished" podID="b360c828-8ede-436a-8399-b14caf072e00" containerID="578c5fe47ba948654d01cab81525d7e0686ba814ee276265afff07b94c6ea598" exitCode=0 Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.595386 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bpb" event={"ID":"b360c828-8ede-436a-8399-b14caf072e00","Type":"ContainerDied","Data":"578c5fe47ba948654d01cab81525d7e0686ba814ee276265afff07b94c6ea598"} Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.600678 5200 generic.go:358] "Generic (PLEG): container finished" podID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerID="b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a" exitCode=0 Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.600783 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bfdtt" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.600786 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfdtt" event={"ID":"68d9317f-7f72-4ac5-babc-bf891defbbd9","Type":"ContainerDied","Data":"b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a"} Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.600820 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfdtt" event={"ID":"68d9317f-7f72-4ac5-babc-bf891defbbd9","Type":"ContainerDied","Data":"a37d48826d3816dfff5eddb654e1948086b4c2642de15eae2fb9eeef1244cbb0"} Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.600840 5200 scope.go:117] "RemoveContainer" containerID="b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.605322 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"af2aea9e-f4a1-4558-812f-eab1f974436d","Type":"ContainerStarted","Data":"d4e367c987a9a1e2dcfa8c41bfda57e107193ebcda042e3de4ce792d6949f7e6"} Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.624769 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-12-crc" podStartSLOduration=3.62474591 podStartE2EDuration="3.62474591s" podCreationTimestamp="2025-11-26 13:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:33:15.622049866 +0000 UTC m=+164.301251684" watchObservedRunningTime="2025-11-26 13:33:15.62474591 +0000 UTC m=+164.303947728" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.638071 5200 scope.go:117] "RemoveContainer" containerID="45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.642633 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bfdtt"] Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.649030 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bfdtt"] Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.665032 5200 scope.go:117] "RemoveContainer" containerID="a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.701467 5200 scope.go:117] "RemoveContainer" containerID="b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a" Nov 26 13:33:15 crc kubenswrapper[5200]: E1126 13:33:15.701939 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a\": container with ID starting with b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a not found: ID does not exist" containerID="b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.701974 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a"} err="failed to get container status \"b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a\": rpc error: code = NotFound desc = could not find container 
\"b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a\": container with ID starting with b73a79abee53ec35bfd504888f8028f6e6aa3c137e1af6179418471ca15ddd7a not found: ID does not exist" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.701998 5200 scope.go:117] "RemoveContainer" containerID="45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a" Nov 26 13:33:15 crc kubenswrapper[5200]: E1126 13:33:15.702233 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a\": container with ID starting with 45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a not found: ID does not exist" containerID="45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.702251 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a"} err="failed to get container status \"45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a\": rpc error: code = NotFound desc = could not find container \"45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a\": container with ID starting with 45adf47f24501867b02f739477a780cff59c1f482d03d819b719def2e4f7772a not found: ID does not exist" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.702267 5200 scope.go:117] "RemoveContainer" containerID="a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f" Nov 26 13:33:15 crc kubenswrapper[5200]: E1126 13:33:15.702532 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f\": container with ID starting with a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f not found: ID does not exist" containerID="a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.702552 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f"} err="failed to get container status \"a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f\": rpc error: code = NotFound desc = could not find container \"a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f\": container with ID starting with a1f930e414c591bedad6b764ecde3f213596099018f548e29dffffaebab2624f not found: ID does not exist" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.711494 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.804555 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s45sd\" (UniqueName: \"kubernetes.io/projected/b360c828-8ede-436a-8399-b14caf072e00-kube-api-access-s45sd\") pod \"b360c828-8ede-436a-8399-b14caf072e00\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.804774 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-utilities\") pod \"b360c828-8ede-436a-8399-b14caf072e00\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.805638 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-utilities" (OuterVolumeSpecName: "utilities") pod "b360c828-8ede-436a-8399-b14caf072e00" (UID: "b360c828-8ede-436a-8399-b14caf072e00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.805906 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-catalog-content\") pod \"b360c828-8ede-436a-8399-b14caf072e00\" (UID: \"b360c828-8ede-436a-8399-b14caf072e00\") " Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.812200 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b360c828-8ede-436a-8399-b14caf072e00-kube-api-access-s45sd" (OuterVolumeSpecName: "kube-api-access-s45sd") pod "b360c828-8ede-436a-8399-b14caf072e00" (UID: "b360c828-8ede-436a-8399-b14caf072e00"). InnerVolumeSpecName "kube-api-access-s45sd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.815003 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b360c828-8ede-436a-8399-b14caf072e00" (UID: "b360c828-8ede-436a-8399-b14caf072e00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.815373 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.815395 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s45sd\" (UniqueName: \"kubernetes.io/projected/b360c828-8ede-436a-8399-b14caf072e00-kube-api-access-s45sd\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:15 crc kubenswrapper[5200]: I1126 13:33:15.815408 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b360c828-8ede-436a-8399-b14caf072e00-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.614774 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5bpb" event={"ID":"b360c828-8ede-436a-8399-b14caf072e00","Type":"ContainerDied","Data":"791e6f7190b4015dff19601a292ed06492c85af9c78622957c1eff7d1e57af43"} Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.615217 5200 scope.go:117] "RemoveContainer" containerID="578c5fe47ba948654d01cab81525d7e0686ba814ee276265afff07b94c6ea598" Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.614896 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5bpb" Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.640503 5200 scope.go:117] "RemoveContainer" containerID="4a696494b883ae785d71c6c3cfc00da08a1bfaf86b1b8fdc441694fc46f119d0" Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.653770 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bpb"] Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.657279 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5bpb"] Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.673491 5200 scope.go:117] "RemoveContainer" containerID="30451a5da751385cb27f0f6fa1550a08bc5e0f10ccc2979f9436c8d4b9898b6f" Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.978357 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" path="/var/lib/kubelet/pods/68d9317f-7f72-4ac5-babc-bf891defbbd9/volumes" Nov 26 13:33:16 crc kubenswrapper[5200]: I1126 13:33:16.979078 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b360c828-8ede-436a-8399-b14caf072e00" path="/var/lib/kubelet/pods/b360c828-8ede-436a-8399-b14caf072e00/volumes" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.540059 5200 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.639864 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-nmkr7"] Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.671312 5200 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672084 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="registry-server" Nov 26 
13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672113 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="registry-server" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672129 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="registry-server" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672139 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="registry-server" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672151 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="extract-utilities" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672159 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="extract-utilities" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672172 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="extract-content" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672179 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="extract-content" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672187 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="extract-content" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672194 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="extract-content" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672205 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="extract-utilities" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672212 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="extract-utilities" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672334 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="68d9317f-7f72-4ac5-babc-bf891defbbd9" containerName="registry-server" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.672356 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b360c828-8ede-436a-8399-b14caf072e00" containerName="registry-server" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.686332 5200 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.686383 5200 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.686967 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687847 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687869 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687881 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687891 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687918 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687926 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687934 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687939 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687956 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687961 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687935 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" containerID="cri-o://5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1" gracePeriod=15 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687970 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687976 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687986 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.687991 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688075 5200 kuberuntime_container.go:858] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5" gracePeriod=15 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688128 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77" gracePeriod=15 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688131 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625" gracePeriod=15 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688246 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54" gracePeriod=15 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688079 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688295 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688306 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688315 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688324 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.688333 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.703204 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3a14caf222afb62aaabdc47808b6f944" podUID="57755cc5f99000cc11e193051474d4e2" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.735115 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.790258 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 
crc kubenswrapper[5200]: I1126 13:33:52.790896 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791059 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791133 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791209 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791302 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791376 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791446 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791518 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.791596 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.879172 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.880351 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.881010 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5" exitCode=0 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.881035 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54" exitCode=0 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.881043 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625" exitCode=0 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.881049 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77" exitCode=2 Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.881178 5200 scope.go:117] "RemoveContainer" containerID="71e3a552bee1d75455cc218874ef2ab8f739c73cd2ea145aeb7f16c619884376" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.892934 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893053 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893162 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893252 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893343 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") 
pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893263 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893487 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893567 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893575 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893750 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893839 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893641 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893915 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.893921 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.894008 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.894060 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.894076 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.894182 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.894296 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:52 crc kubenswrapper[5200]: I1126 13:33:52.894370 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:53 crc kubenswrapper[5200]: E1126 13:33:53.843821 5200 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:53 crc kubenswrapper[5200]: E1126 13:33:53.844722 5200 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:53 crc kubenswrapper[5200]: E1126 13:33:53.845417 5200 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:53 crc kubenswrapper[5200]: E1126 13:33:53.846034 5200 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:53 crc kubenswrapper[5200]: E1126 13:33:53.846513 5200 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:53 crc kubenswrapper[5200]: I1126 13:33:53.846570 5200 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 26 13:33:53 crc kubenswrapper[5200]: E1126 13:33:53.847122 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Nov 26 13:33:53 crc kubenswrapper[5200]: I1126 13:33:53.892866 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Nov 26 13:33:53 crc kubenswrapper[5200]: I1126 13:33:53.896395 5200 generic.go:358] "Generic (PLEG): container finished" podID="af2aea9e-f4a1-4558-812f-eab1f974436d" containerID="d4e367c987a9a1e2dcfa8c41bfda57e107193ebcda042e3de4ce792d6949f7e6" exitCode=0 Nov 26 13:33:53 crc kubenswrapper[5200]: I1126 13:33:53.896461 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"af2aea9e-f4a1-4558-812f-eab1f974436d","Type":"ContainerDied","Data":"d4e367c987a9a1e2dcfa8c41bfda57e107193ebcda042e3de4ce792d6949f7e6"} Nov 26 13:33:53 crc kubenswrapper[5200]: I1126 13:33:53.897427 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:54 crc kubenswrapper[5200]: E1126 13:33:54.048808 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Nov 26 13:33:54 crc kubenswrapper[5200]: E1126 13:33:54.449725 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.184307 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.186158 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231312 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-kubelet-dir\") pod \"af2aea9e-f4a1-4558-812f-eab1f974436d\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231395 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-var-lock\") pod \"af2aea9e-f4a1-4558-812f-eab1f974436d\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231435 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af2aea9e-f4a1-4558-812f-eab1f974436d" (UID: "af2aea9e-f4a1-4558-812f-eab1f974436d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231457 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-var-lock" (OuterVolumeSpecName: "var-lock") pod "af2aea9e-f4a1-4558-812f-eab1f974436d" (UID: "af2aea9e-f4a1-4558-812f-eab1f974436d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af2aea9e-f4a1-4558-812f-eab1f974436d-kube-api-access\") pod \"af2aea9e-f4a1-4558-812f-eab1f974436d\" (UID: \"af2aea9e-f4a1-4558-812f-eab1f974436d\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231769 5200 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.231783 5200 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af2aea9e-f4a1-4558-812f-eab1f974436d-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.239852 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2aea9e-f4a1-4558-812f-eab1f974436d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af2aea9e-f4a1-4558-812f-eab1f974436d" (UID: "af2aea9e-f4a1-4558-812f-eab1f974436d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: E1126 13:33:55.250603 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.333395 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af2aea9e-f4a1-4558-812f-eab1f974436d-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.610776 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.612299 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.612867 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.613300 5200 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.741528 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.741656 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.741866 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.742024 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.742181 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.742191 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.742349 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.742356 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.742805 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" (OuterVolumeSpecName: "ca-bundle-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "ca-bundle-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.743216 5200 reconciler_common.go:299] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.743246 5200 reconciler_common.go:299] "Volume detached for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.743263 5200 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.743275 5200 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.745025 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.845879 5200 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.907883 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.909546 5200 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1" exitCode=0 Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.909621 5200 scope.go:117] "RemoveContainer" containerID="0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.909652 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.912486 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"af2aea9e-f4a1-4558-812f-eab1f974436d","Type":"ContainerDied","Data":"2d9d1a46a2131f88cd2c413b533a2f0aec5c1ca79d9b842b6039944df28399cd"} Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.912524 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d9d1a46a2131f88cd2c413b533a2f0aec5c1ca79d9b842b6039944df28399cd" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.912583 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.932875 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.933141 5200 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.933273 5200 scope.go:117] "RemoveContainer" containerID="b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.933350 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.933604 5200 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.956053 5200 scope.go:117] "RemoveContainer" containerID="5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.972955 5200 scope.go:117] "RemoveContainer" containerID="3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77" Nov 26 13:33:55 crc kubenswrapper[5200]: I1126 13:33:55.996787 5200 scope.go:117] "RemoveContainer" containerID="5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.023794 5200 scope.go:117] "RemoveContainer" containerID="6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.118635 5200 scope.go:117] "RemoveContainer" containerID="0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.119630 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5\": container with ID starting with 0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5 not found: ID does not exist" containerID="0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.119679 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5"} err="failed to get container status \"0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5\": rpc error: code = NotFound desc = could not find container 
\"0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5\": container with ID starting with 0a2b379acdd02d212fa3435743a25324575aeba1fb43568de3fe1335d70781d5 not found: ID does not exist" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.119741 5200 scope.go:117] "RemoveContainer" containerID="b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.120511 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\": container with ID starting with b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54 not found: ID does not exist" containerID="b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.120556 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54"} err="failed to get container status \"b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\": rpc error: code = NotFound desc = could not find container \"b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54\": container with ID starting with b600dcf6dfc8cead2b66d5818653a5d24d4c6c49f7b3b0ca6b2282b0c6bbca54 not found: ID does not exist" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.120584 5200 scope.go:117] "RemoveContainer" containerID="5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.120905 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\": container with ID starting with 5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625 not found: ID does not exist" containerID="5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.120939 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625"} err="failed to get container status \"5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\": rpc error: code = NotFound desc = could not find container \"5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625\": container with ID starting with 5acf793f545f45990a346fca21455b4e3e1147f80c9fae0a4a7ecb3449caf625 not found: ID does not exist" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.120955 5200 scope.go:117] "RemoveContainer" containerID="3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.121279 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\": container with ID starting with 3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77 not found: ID does not exist" containerID="3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.121320 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77"} 
err="failed to get container status \"3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\": rpc error: code = NotFound desc = could not find container \"3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77\": container with ID starting with 3e6de8a6ea99fb2aa1debdcfc53d556693b5d8c75b9aadf3f6f1215c4a7c9d77 not found: ID does not exist" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.121349 5200 scope.go:117] "RemoveContainer" containerID="5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.121648 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\": container with ID starting with 5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1 not found: ID does not exist" containerID="5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.121670 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1"} err="failed to get container status \"5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\": rpc error: code = NotFound desc = could not find container \"5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1\": container with ID starting with 5924825b7c4e00ef14a6b11c7c04c2e1dde4418c99691645b27cb9ab987da9e1 not found: ID does not exist" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.121709 5200 scope.go:117] "RemoveContainer" containerID="6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.121957 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\": container with ID starting with 6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678 not found: ID does not exist" containerID="6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.121980 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678"} err="failed to get container status \"6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\": rpc error: code = NotFound desc = could not find container \"6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678\": container with ID starting with 6b75737c2c84a2ee0f1aa9117afe5205bae4c3cf51297bb656520192dc981678 not found: ID does not exist" Nov 26 13:33:56 crc kubenswrapper[5200]: E1126 13:33:56.851771 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Nov 26 13:33:56 crc kubenswrapper[5200]: I1126 13:33:56.976762 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a14caf222afb62aaabdc47808b6f944" path="/var/lib/kubelet/pods/3a14caf222afb62aaabdc47808b6f944/volumes" Nov 26 13:33:57 crc kubenswrapper[5200]: E1126 13:33:57.737459 5200 kubelet.go:3342] "Failed creating a mirror pod" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:57 crc kubenswrapper[5200]: I1126 13:33:57.738131 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:57 crc kubenswrapper[5200]: W1126 13:33:57.768703 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7dbc7e1ee9c187a863ef9b473fad27b.slice/crio-ba40cef49a1528465e3762a902c2dd19b3a4cabc7dacb9ced631b1b37ea6dcaf WatchSource:0}: Error finding container ba40cef49a1528465e3762a902c2dd19b3a4cabc7dacb9ced631b1b37ea6dcaf: Status 404 returned error can't find the container with id ba40cef49a1528465e3762a902c2dd19b3a4cabc7dacb9ced631b1b37ea6dcaf Nov 26 13:33:57 crc kubenswrapper[5200]: E1126 13:33:57.773918 5200 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b91d86cd8f9a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 13:33:57.770803623 +0000 UTC m=+206.450005441,LastTimestamp:2025-11-26 13:33:57.770803623 +0000 UTC m=+206.450005441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 13:33:57 crc kubenswrapper[5200]: I1126 13:33:57.928933 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"ba40cef49a1528465e3762a902c2dd19b3a4cabc7dacb9ced631b1b37ea6dcaf"} Nov 26 13:33:58 crc kubenswrapper[5200]: I1126 13:33:58.936622 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702"} Nov 26 13:33:58 crc kubenswrapper[5200]: I1126 13:33:58.937323 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:33:58 crc kubenswrapper[5200]: I1126 13:33:58.938162 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:58 crc kubenswrapper[5200]: E1126 13:33:58.939028 5200 kubelet.go:3342] "Failed creating a mirror pod" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:59 crc kubenswrapper[5200]: I1126 13:33:59.944342 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:33:59 crc kubenswrapper[5200]: E1126 13:33:59.944962 5200 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:34:00 crc kubenswrapper[5200]: E1126 13:34:00.053404 5200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="6.4s" Nov 26 13:34:02 crc kubenswrapper[5200]: I1126 13:34:02.977698 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.154884 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.154977 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:34:03 crc kubenswrapper[5200]: E1126 13:34:03.790421 5200 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b91d86cd8f9a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 13:33:57.770803623 +0000 UTC m=+206.450005441,LastTimestamp:2025-11-26 13:33:57.770803623 +0000 UTC m=+206.450005441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.969051 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.969951 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.990453 5200 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.990487 5200 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:03 crc kubenswrapper[5200]: E1126 13:34:03.991032 5200 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:03 crc kubenswrapper[5200]: I1126 13:34:03.991436 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:04 crc kubenswrapper[5200]: I1126 13:34:04.981169 5200 generic.go:358] "Generic (PLEG): container finished" podID="57755cc5f99000cc11e193051474d4e2" containerID="67af5a992b481927ee7c805924da1f9807a684fee37e737eb864ee2a150ed60f" exitCode=0 Nov 26 13:34:04 crc kubenswrapper[5200]: I1126 13:34:04.981281 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerDied","Data":"67af5a992b481927ee7c805924da1f9807a684fee37e737eb864ee2a150ed60f"} Nov 26 13:34:04 crc kubenswrapper[5200]: I1126 13:34:04.981603 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"d86b0acb22ca109af60f51004163470659cbf82a69a1627a670bfaaa6f0e5cc8"} Nov 26 13:34:04 crc kubenswrapper[5200]: I1126 13:34:04.981938 5200 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:04 crc kubenswrapper[5200]: I1126 13:34:04.981952 5200 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:04 crc kubenswrapper[5200]: E1126 13:34:04.982879 5200 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:04 crc kubenswrapper[5200]: I1126 13:34:04.982626 5200 status_manager.go:895] "Failed to get status for pod" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Nov 26 13:34:06 crc kubenswrapper[5200]: I1126 13:34:06.005972 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"6424ceeffd9464c2f43232a09eb854a48901328219dd888bcb2b3b1ea2f7884e"} Nov 26 13:34:06 crc kubenswrapper[5200]: I1126 13:34:06.006297 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"d80c23b75d493f9699c6a5d7bc1251df656abc42ecfdc0877a7e58da6877fe1b"} Nov 26 13:34:06 crc kubenswrapper[5200]: I1126 13:34:06.006311 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"09b50dc53fa6490d9370a37f55429e78a85d2ac6a33b64993a308871bd054a7f"} Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.015591 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"1719c69a90292a80bae03bbd0948b1a2cac49bb92bef527a9889a58d029821b0"} Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.015636 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"247e9660ec49530c1c2b5a0dc011eb96dc947cf8b48be39b56c96654f4d32726"} Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.015789 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.015896 5200 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.015921 5200 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.391057 5200 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:60714->192.168.126.11:10257: read: connection reset by peer" start-of-body= Nov 26 13:34:07 crc kubenswrapper[5200]: I1126 13:34:07.391178 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:60714->192.168.126.11:10257: read: connection reset by peer" Nov 26 13:34:08 crc kubenswrapper[5200]: I1126 13:34:08.024936 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Nov 26 13:34:08 crc kubenswrapper[5200]: I1126 13:34:08.024981 5200 generic.go:358] "Generic (PLEG): container finished" podID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerID="0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd" exitCode=1 Nov 26 13:34:08 crc kubenswrapper[5200]: I1126 13:34:08.025043 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerDied","Data":"0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd"} Nov 26 13:34:08 crc kubenswrapper[5200]: I1126 13:34:08.025458 5200 scope.go:117] "RemoveContainer" containerID="0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd" Nov 26 13:34:08 crc kubenswrapper[5200]: I1126 13:34:08.992119 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:08 crc kubenswrapper[5200]: I1126 13:34:08.992429 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:09 crc kubenswrapper[5200]: I1126 13:34:09.001289 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:09 crc kubenswrapper[5200]: I1126 13:34:09.046647 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Nov 26 13:34:09 crc kubenswrapper[5200]: I1126 13:34:09.046856 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"773a49b041ecb3acd25da26e90528df4c7e81391a4ed66d60a4cd823bac3f101"} Nov 26 13:34:11 crc kubenswrapper[5200]: I1126 13:34:11.207159 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:34:11 crc kubenswrapper[5200]: I1126 13:34:11.209797 5200 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 26 13:34:11 crc kubenswrapper[5200]: I1126 13:34:11.210211 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 26 13:34:12 crc kubenswrapper[5200]: I1126 13:34:12.026986 5200 kubelet.go:3329] "Deleted mirror pod as it didn't match the static Pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:12 crc kubenswrapper[5200]: I1126 13:34:12.027319 5200 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:12 crc kubenswrapper[5200]: I1126 13:34:12.066325 5200 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:12 crc kubenswrapper[5200]: I1126 13:34:12.066360 5200 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:12 crc kubenswrapper[5200]: I1126 13:34:12.070463 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 13:34:12 crc kubenswrapper[5200]: I1126 13:34:12.151544 
5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:34:13 crc kubenswrapper[5200]: I1126 13:34:13.006296 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="e37f27a0-f559-4cd4-b55c-458db83e3d20" Nov 26 13:34:13 crc kubenswrapper[5200]: I1126 13:34:13.072469 5200 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:13 crc kubenswrapper[5200]: I1126 13:34:13.072496 5200 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b" Nov 26 13:34:13 crc kubenswrapper[5200]: I1126 13:34:13.075831 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="e37f27a0-f559-4cd4-b55c-458db83e3d20" Nov 26 13:34:17 crc kubenswrapper[5200]: I1126 13:34:17.678880 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" podUID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" containerName="oauth-openshift" containerID="cri-o://03d06edce6a485eae1ef49e15d55331d857733adf09ef67bf19d1a4b681ec607" gracePeriod=15 Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.107756 5200 generic.go:358] "Generic (PLEG): container finished" podID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" containerID="03d06edce6a485eae1ef49e15d55331d857733adf09ef67bf19d1a4b681ec607" exitCode=0 Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.107849 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" event={"ID":"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e","Type":"ContainerDied","Data":"03d06edce6a485eae1ef49e15d55331d857733adf09ef67bf19d1a4b681ec607"} Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.166018 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.259874 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-login\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260043 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-cliconfig\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260102 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-provider-selection\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260142 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-trusted-ca-bundle\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260166 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-session\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260201 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-dir\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260246 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-serving-cert\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260268 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-error\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260301 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-router-certs\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 
crc kubenswrapper[5200]: I1126 13:34:18.260326 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-ocp-branding-template\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260360 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8rpj\" (UniqueName: \"kubernetes.io/projected/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-kube-api-access-r8rpj\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260393 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-policies\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260434 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-service-ca\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260511 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-idp-0-file-data\") pod \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\" (UID: \"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e\") " Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260379 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260961 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.260975 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.261347 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.264106 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.269255 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.269323 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-kube-api-access-r8rpj" (OuterVolumeSpecName: "kube-api-access-r8rpj") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "kube-api-access-r8rpj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.269583 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.270180 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.280916 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.281291 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.281967 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.282049 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.282461 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" (UID: "50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361790 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361852 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361873 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361893 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361914 5200 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361933 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361951 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361971 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.361989 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.362009 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8rpj\" (UniqueName: \"kubernetes.io/projected/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-kube-api-access-r8rpj\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.362027 5200 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.362044 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.362061 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:18 crc kubenswrapper[5200]: I1126 13:34:18.362079 5200 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:19 crc kubenswrapper[5200]: I1126 13:34:19.119368 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" event={"ID":"50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e","Type":"ContainerDied","Data":"8c121d22ba660cc2a9582708fa612b4d32d21312fa5b784af1ac470ca8897f02"} Nov 26 13:34:19 crc kubenswrapper[5200]: I1126 13:34:19.119428 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-nmkr7" Nov 26 13:34:19 crc kubenswrapper[5200]: I1126 13:34:19.119816 5200 scope.go:117] "RemoveContainer" containerID="03d06edce6a485eae1ef49e15d55331d857733adf09ef67bf19d1a4b681ec607" Nov 26 13:34:21 crc kubenswrapper[5200]: I1126 13:34:21.207492 5200 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 26 13:34:21 crc kubenswrapper[5200]: I1126 13:34:21.207963 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 26 13:34:22 crc kubenswrapper[5200]: I1126 13:34:22.083449 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:22 crc kubenswrapper[5200]: I1126 13:34:22.732566 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Nov 26 13:34:23 crc kubenswrapper[5200]: I1126 13:34:23.217212 5200 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:34:23 crc kubenswrapper[5200]: I1126 13:34:23.502830 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\"" Nov 26 13:34:23 crc kubenswrapper[5200]: I1126 13:34:23.677985 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.168477 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.323300 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.428838 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.484734 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.527352 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.619980 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.674771 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.835768 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\"" Nov 26 13:34:24 crc kubenswrapper[5200]: I1126 13:34:24.919327 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Nov 26 13:34:25 crc kubenswrapper[5200]: I1126 13:34:25.429302 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Nov 26 13:34:25 crc kubenswrapper[5200]: I1126 13:34:25.620408 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Nov 26 13:34:25 crc kubenswrapper[5200]: I1126 13:34:25.681641 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\"" Nov 26 13:34:25 crc kubenswrapper[5200]: I1126 13:34:25.797983 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Nov 26 13:34:25 crc kubenswrapper[5200]: I1126 13:34:25.813869 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.045241 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.151099 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.314789 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.414324 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.485407 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.667242 5200 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.760187 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.817993 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.897379 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\"" Nov 26 13:34:26 crc kubenswrapper[5200]: I1126 13:34:26.988061 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.029640 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.055987 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.065159 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.140414 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.172700 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.243219 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.259669 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.311023 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.351275 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.576966 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.594226 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.648086 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.676090 5200 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.699125 5200 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.711133 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.714616 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.844036 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.873493 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.923866 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:27 crc kubenswrapper[5200]: I1126 13:34:27.945894 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.009737 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.153544 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.222344 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.238920 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.293769 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.424112 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.449289 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.703612 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.782796 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:28 crc kubenswrapper[5200]: I1126 13:34:28.825287 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.099036 5200 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.118955 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.162913 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.166588 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.169388 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.202846 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.304245 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.439136 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.455191 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.510163 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.633120 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.721535 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.723109 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.767462 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.794801 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.855215 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Nov 26 13:34:29 crc kubenswrapper[5200]: I1126 13:34:29.897493 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.017883 5200 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.060868 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.319157 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.375163 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.463015 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.530390 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.800044 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.824274 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.863845 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.947457 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\"" Nov 26 13:34:30 crc kubenswrapper[5200]: I1126 13:34:30.988515 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.057124 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.126922 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.177775 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.201144 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.204617 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.207744 5200 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 
192.168.126.11:10257: connect: connection refused" start-of-body= Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.207868 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.207956 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.209101 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"773a49b041ecb3acd25da26e90528df4c7e81391a4ed66d60a4cd823bac3f101"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.209263 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="kube-controller-manager" containerID="cri-o://773a49b041ecb3acd25da26e90528df4c7e81391a4ed66d60a4cd823bac3f101" gracePeriod=30 Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.241123 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.275375 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.298185 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.345476 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.361277 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.369136 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.377167 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.385769 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.420197 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.536241 5200 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.572259 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.600663 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.604987 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.608221 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.635534 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.740608 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.750933 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.843601 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\"" Nov 26 13:34:31 crc kubenswrapper[5200]: I1126 13:34:31.981866 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.021924 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.130277 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.263726 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.451296 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.484358 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.536849 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.588900 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.611676 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.765040 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.785655 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.821287 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.822238 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.834497 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.850302 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.890871 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.910295 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.916219 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\"" Nov 26 13:34:32 crc kubenswrapper[5200]: I1126 13:34:32.928776 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.154204 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.154486 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.160173 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.162615 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.173749 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\"" Nov 26 
13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.237816 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.295447 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.322314 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.369423 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.375341 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.494480 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.568982 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.578918 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.687757 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.694577 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.722818 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.768360 5200 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.846068 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.890499 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\"" Nov 26 13:34:33 crc kubenswrapper[5200]: I1126 13:34:33.934883 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.010212 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.092897 5200 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.094577 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\"" Nov 26 
13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.098353 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-nmkr7","openshift-kube-apiserver/kube-apiserver-crc"]
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.098421 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-749fddbf6b-ff6sg"]
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.101167 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" containerName="installer"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.101201 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" containerName="installer"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.101226 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" containerName="oauth-openshift"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.101233 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" containerName="oauth-openshift"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.101429 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="af2aea9e-f4a1-4558-812f-eab1f974436d" containerName="installer"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.101450 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" containerName="oauth-openshift"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.102513 5200 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.102568 5200 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d60e4b9f-fed8-4c81-882c-33ac5ef7275b"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.124545 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.124892 5200 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.129282 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.129356 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.129534 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.129733 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.129967 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.130227 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.130312 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.130380 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.130805 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.131326 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.131509 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.132810 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.132842 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.143481 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.148235 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.161600 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.161578804 podStartE2EDuration="22.161578804s" podCreationTimestamp="2025-11-26 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:34:34.155241032 +0000 UTC m=+242.834442860" watchObservedRunningTime="2025-11-26 13:34:34.161578804 +0000 UTC m=+242.840780622" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196420 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-router-certs\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196481 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-service-ca\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196524 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-error\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196630 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196708 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-audit-policies\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196768 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-login\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196840 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-session\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196876 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196925 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f8613f-5927-4897-95fa-e6796f13a5ee-audit-dir\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.196958 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.197063 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.197095 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqc9f\" (UniqueName: \"kubernetes.io/projected/05f8613f-5927-4897-95fa-e6796f13a5ee-kube-api-access-qqc9f\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.197124 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.197179 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.210135 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.218364 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\"" Nov 
26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.272127 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.299161 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-login\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.299965 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-session\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.299999 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300032 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f8613f-5927-4897-95fa-e6796f13a5ee-audit-dir\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300053 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300083 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300101 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9f\" (UniqueName: \"kubernetes.io/projected/05f8613f-5927-4897-95fa-e6796f13a5ee-kube-api-access-qqc9f\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300122 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300154 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300174 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-router-certs\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300198 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-service-ca\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.300193 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f8613f-5927-4897-95fa-e6796f13a5ee-audit-dir\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.301164 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.301296 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-service-ca\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.302367 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.303573 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.303840 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-error\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.303902 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.303950 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-audit-policies\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.316029 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f8613f-5927-4897-95fa-e6796f13a5ee-audit-policies\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.324153 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-router-certs\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.324442 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.325636 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.327161 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-login\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.327493 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-template-error\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.328787 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.334171 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-session\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.340914 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f8613f-5927-4897-95fa-e6796f13a5ee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.345044 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqc9f\" (UniqueName: \"kubernetes.io/projected/05f8613f-5927-4897-95fa-e6796f13a5ee-kube-api-access-qqc9f\") pod \"oauth-openshift-749fddbf6b-ff6sg\" (UID: \"05f8613f-5927-4897-95fa-e6796f13a5ee\") " pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.368233 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.448489 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.501075 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.524445 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.549955 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.603663 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.612367 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.613446 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.630982 5200 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.631264 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" containerID="cri-o://54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702" gracePeriod=5 Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.710739 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.849658 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.885571 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.894242 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.925408 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\"" Nov 26 13:34:34 crc kubenswrapper[5200]: I1126 13:34:34.979221 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e" path="/var/lib/kubelet/pods/50b396e2-4ab8-4be6-a1c6-fe42e7f37e7e/volumes" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.041728 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.161782 5200 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.176965 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.328090 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.439856 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.442015 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.472823 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.477646 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.503463 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.616559 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.637781 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.683930 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.739197 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.795009 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.802142 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.874750 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.918164 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\"" Nov 26 13:34:35 crc kubenswrapper[5200]: I1126 13:34:35.972008 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.018571 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\"" Nov 26 13:34:36 crc 
kubenswrapper[5200]: I1126 13:34:36.088907 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.115046 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.115121 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.135122 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.355268 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.471072 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.495621 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.496595 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.586724 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.646648 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.649325 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.694664 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.725463 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:36 crc kubenswrapper[5200]: I1126 13:34:36.817409 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.165303 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.263152 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" event={"ID":"05f8613f-5927-4897-95fa-e6796f13a5ee","Type":"ContainerStarted","Data":"b93875b5c06a00982a94d42bf2c789c3abaedf77481a3a4d72a46ef24c22ba17"} Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.307854 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.322504 5200 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.345553 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.387327 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.394239 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.513582 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.522605 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.588474 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.589196 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.652542 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.776456 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.895637 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.898316 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\"" Nov 26 13:34:37 crc kubenswrapper[5200]: I1126 13:34:37.967371 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.000576 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.073250 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.102727 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.241162 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.272802 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" 
event={"ID":"05f8613f-5927-4897-95fa-e6796f13a5ee","Type":"ContainerStarted","Data":"aec0f82e3d6097857c56371c98205b2d5500a084d16cc7b98f2d3542710d1b26"} Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.273052 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.299367 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" podStartSLOduration=46.299348473 podStartE2EDuration="46.299348473s" podCreationTimestamp="2025-11-26 13:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:34:38.294321091 +0000 UTC m=+246.973522999" watchObservedRunningTime="2025-11-26 13:34:38.299348473 +0000 UTC m=+246.978550301" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.405114 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.483534 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.489389 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.512895 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.576054 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-749fddbf6b-ff6sg" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.845940 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Nov 26 13:34:38 crc kubenswrapper[5200]: I1126 13:34:38.884625 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.010984 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.021200 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.033518 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.173475 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.190114 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.221283 5200 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.423968 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.532159 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\"" Nov 26 13:34:39 crc kubenswrapper[5200]: I1126 13:34:39.608943 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.105615 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.222337 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.222449 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.224584 5200 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="pods \"kube-apiserver-startup-monitor-crc\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-kube-apiserver\": no relationship found between node 'crc' and this object" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.285678 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.285737 5200 generic.go:358] "Generic (PLEG): container finished" podID="f7dbc7e1ee9c187a863ef9b473fad27b" containerID="54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702" exitCode=137 Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.285968 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.285969 5200 scope.go:117] "RemoveContainer" containerID="54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.298129 5200 scope.go:117] "RemoveContainer" containerID="54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702" Nov 26 13:34:40 crc kubenswrapper[5200]: E1126 13:34:40.298570 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702\": container with ID starting with 54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702 not found: ID does not exist" containerID="54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.298670 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702"} err="failed to get container status \"54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702\": rpc error: code = NotFound desc = could not find container \"54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702\": container with ID starting with 54a2149bd1152ef7574a231a82975eb8374ca3ebaaa1a7f4d93a65a9f7886702 not found: ID does not exist" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.299096 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.299390 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.299590 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log" (OuterVolumeSpecName: "var-log") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.299489 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.300439 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.300513 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.300549 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.300616 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock" (OuterVolumeSpecName: "var-lock") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.300735 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests" (OuterVolumeSpecName: "manifests") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.301110 5200 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.301139 5200 reconciler_common.go:299] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.301155 5200 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.301170 5200 reconciler_common.go:299] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.314610 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.376346 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.403597 5200 reconciler_common.go:299] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.606323 5200 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="pods \"kube-apiserver-startup-monitor-crc\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-kube-apiserver\": no relationship found between node 'crc' and this object" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.908435 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\"" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.979741 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" path="/var/lib/kubelet/pods/f7dbc7e1ee9c187a863ef9b473fad27b/volumes" Nov 26 13:34:40 crc kubenswrapper[5200]: I1126 13:34:40.989075 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\"" Nov 26 13:34:41 crc kubenswrapper[5200]: I1126 13:34:41.154780 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\"" Nov 26 13:34:58 crc kubenswrapper[5200]: I1126 13:34:58.901735 5200 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-gj489 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 26 13:34:58 crc kubenswrapper[5200]: I1126 13:34:58.902263 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 26 13:34:59 crc kubenswrapper[5200]: I1126 13:34:59.429386 5200 generic.go:358] "Generic (PLEG): container finished" podID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerID="fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289" exitCode=0 Nov 26 13:34:59 crc kubenswrapper[5200]: I1126 13:34:59.429516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" event={"ID":"10b6ac0f-ebd5-49ff-a2d7-f729135c410e","Type":"ContainerDied","Data":"fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289"} Nov 26 13:34:59 crc kubenswrapper[5200]: I1126 13:34:59.430320 5200 scope.go:117] "RemoveContainer" containerID="fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289" Nov 26 13:35:00 crc kubenswrapper[5200]: I1126 13:35:00.435772 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" event={"ID":"10b6ac0f-ebd5-49ff-a2d7-f729135c410e","Type":"ContainerStarted","Data":"cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016"} Nov 26 13:35:00 crc kubenswrapper[5200]: I1126 13:35:00.436557 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:35:00 crc kubenswrapper[5200]: I1126 13:35:00.439475 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:35:01 crc kubenswrapper[5200]: I1126 13:35:01.446245 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:35:01 crc kubenswrapper[5200]: I1126 13:35:01.448461 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Nov 26 13:35:01 crc kubenswrapper[5200]: I1126 13:35:01.448512 5200 generic.go:358] "Generic (PLEG): container finished" podID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerID="773a49b041ecb3acd25da26e90528df4c7e81391a4ed66d60a4cd823bac3f101" exitCode=137 Nov 26 13:35:01 crc kubenswrapper[5200]: I1126 13:35:01.448579 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerDied","Data":"773a49b041ecb3acd25da26e90528df4c7e81391a4ed66d60a4cd823bac3f101"} Nov 26 13:35:01 crc kubenswrapper[5200]: I1126 13:35:01.448727 5200 scope.go:117] "RemoveContainer" containerID="0b3ca5cebf05a72661778d9b20124b225a970b7d0d740153d0c4cd7c496d61dd" Nov 26 13:35:02 crc kubenswrapper[5200]: I1126 13:35:02.459174 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 
13:35:02 crc kubenswrapper[5200]: I1126 13:35:02.461134 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"ab72e4dd2493d6c7a126667e2cbad0a09593c07f402fe84b57285dfd73d0de99"} Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.154619 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.154989 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.155153 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.155950 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a44e89797844111bb8aded93f9b56ff9fb33b44e5ce546ddc59c7e9f7e94a788"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.156140 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://a44e89797844111bb8aded93f9b56ff9fb33b44e5ce546ddc59c7e9f7e94a788" gracePeriod=600 Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.469311 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="a44e89797844111bb8aded93f9b56ff9fb33b44e5ce546ddc59c7e9f7e94a788" exitCode=0 Nov 26 13:35:03 crc kubenswrapper[5200]: I1126 13:35:03.469388 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"a44e89797844111bb8aded93f9b56ff9fb33b44e5ce546ddc59c7e9f7e94a788"} Nov 26 13:35:04 crc kubenswrapper[5200]: I1126 13:35:04.480075 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"e69b8425297c9144715ca780bb68a537c0ee35471dbc6290f21c3cba97e34571"} Nov 26 13:35:11 crc kubenswrapper[5200]: I1126 13:35:11.207617 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:35:11 crc kubenswrapper[5200]: I1126 13:35:11.212606 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:35:11 crc kubenswrapper[5200]: I1126 13:35:11.515121 5200 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:35:12 crc kubenswrapper[5200]: I1126 13:35:12.524156 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.328505 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9"] Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.329148 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" podUID="82a64db7-bf3c-410b-9772-e830bfa83118" containerName="route-controller-manager" containerID="cri-o://fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494" gracePeriod=30 Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.334357 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"] Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.334612 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" podUID="6d7b6fb5-ca34-4fa3-a509-63527956edec" containerName="controller-manager" containerID="cri-o://0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac" gracePeriod=30 Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.767738 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.774462 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.808287 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"] Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809069 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d7b6fb5-ca34-4fa3-a509-63527956edec" containerName="controller-manager" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809099 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7b6fb5-ca34-4fa3-a509-63527956edec" containerName="controller-manager" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809131 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a64db7-bf3c-410b-9772-e830bfa83118" containerName="route-controller-manager" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809141 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a64db7-bf3c-410b-9772-e830bfa83118" containerName="route-controller-manager" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809150 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809158 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809268 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809288 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d7b6fb5-ca34-4fa3-a509-63527956edec" containerName="controller-manager" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.809300 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a64db7-bf3c-410b-9772-e830bfa83118" containerName="route-controller-manager" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814412 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d7b6fb5-ca34-4fa3-a509-63527956edec-tmp\") pod \"6d7b6fb5-ca34-4fa3-a509-63527956edec\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814487 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-config\") pod \"6d7b6fb5-ca34-4fa3-a509-63527956edec\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814518 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-config\") pod \"82a64db7-bf3c-410b-9772-e830bfa83118\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814600 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-client-ca\") pod \"6d7b6fb5-ca34-4fa3-a509-63527956edec\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " Nov 26 13:35:20 crc kubenswrapper[5200]: 
I1126 13:35:20.814646 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7b6fb5-ca34-4fa3-a509-63527956edec-serving-cert\") pod \"6d7b6fb5-ca34-4fa3-a509-63527956edec\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814700 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82a64db7-bf3c-410b-9772-e830bfa83118-tmp\") pod \"82a64db7-bf3c-410b-9772-e830bfa83118\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814729 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-client-ca\") pod \"82a64db7-bf3c-410b-9772-e830bfa83118\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814760 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzzx5\" (UniqueName: \"kubernetes.io/projected/82a64db7-bf3c-410b-9772-e830bfa83118-kube-api-access-kzzx5\") pod \"82a64db7-bf3c-410b-9772-e830bfa83118\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814806 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2dl4\" (UniqueName: \"kubernetes.io/projected/6d7b6fb5-ca34-4fa3-a509-63527956edec-kube-api-access-d2dl4\") pod \"6d7b6fb5-ca34-4fa3-a509-63527956edec\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814827 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-proxy-ca-bundles\") pod \"6d7b6fb5-ca34-4fa3-a509-63527956edec\" (UID: \"6d7b6fb5-ca34-4fa3-a509-63527956edec\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814894 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82a64db7-bf3c-410b-9772-e830bfa83118-serving-cert\") pod \"82a64db7-bf3c-410b-9772-e830bfa83118\" (UID: \"82a64db7-bf3c-410b-9772-e830bfa83118\") " Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.814950 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d7b6fb5-ca34-4fa3-a509-63527956edec-tmp" (OuterVolumeSpecName: "tmp") pod "6d7b6fb5-ca34-4fa3-a509-63527956edec" (UID: "6d7b6fb5-ca34-4fa3-a509-63527956edec"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.815157 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d7b6fb5-ca34-4fa3-a509-63527956edec-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.815167 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a64db7-bf3c-410b-9772-e830bfa83118-tmp" (OuterVolumeSpecName: "tmp") pod "82a64db7-bf3c-410b-9772-e830bfa83118" (UID: "82a64db7-bf3c-410b-9772-e830bfa83118"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.815307 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-config" (OuterVolumeSpecName: "config") pod "6d7b6fb5-ca34-4fa3-a509-63527956edec" (UID: "6d7b6fb5-ca34-4fa3-a509-63527956edec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.815697 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-config" (OuterVolumeSpecName: "config") pod "82a64db7-bf3c-410b-9772-e830bfa83118" (UID: "82a64db7-bf3c-410b-9772-e830bfa83118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.815850 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-client-ca" (OuterVolumeSpecName: "client-ca") pod "82a64db7-bf3c-410b-9772-e830bfa83118" (UID: "82a64db7-bf3c-410b-9772-e830bfa83118"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.816011 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-client-ca" (OuterVolumeSpecName: "client-ca") pod "6d7b6fb5-ca34-4fa3-a509-63527956edec" (UID: "6d7b6fb5-ca34-4fa3-a509-63527956edec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.816215 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6d7b6fb5-ca34-4fa3-a509-63527956edec" (UID: "6d7b6fb5-ca34-4fa3-a509-63527956edec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.823544 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a64db7-bf3c-410b-9772-e830bfa83118-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "82a64db7-bf3c-410b-9772-e830bfa83118" (UID: "82a64db7-bf3c-410b-9772-e830bfa83118"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.824249 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7b6fb5-ca34-4fa3-a509-63527956edec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6d7b6fb5-ca34-4fa3-a509-63527956edec" (UID: "6d7b6fb5-ca34-4fa3-a509-63527956edec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.824565 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7b6fb5-ca34-4fa3-a509-63527956edec-kube-api-access-d2dl4" (OuterVolumeSpecName: "kube-api-access-d2dl4") pod "6d7b6fb5-ca34-4fa3-a509-63527956edec" (UID: "6d7b6fb5-ca34-4fa3-a509-63527956edec"). InnerVolumeSpecName "kube-api-access-d2dl4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.831943 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a64db7-bf3c-410b-9772-e830bfa83118-kube-api-access-kzzx5" (OuterVolumeSpecName: "kube-api-access-kzzx5") pod "82a64db7-bf3c-410b-9772-e830bfa83118" (UID: "82a64db7-bf3c-410b-9772-e830bfa83118"). InnerVolumeSpecName "kube-api-access-kzzx5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.844086 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"] Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.844140 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9"] Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.851720 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9"] Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.852058 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.852725 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916178 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rl9\" (UniqueName: \"kubernetes.io/projected/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-kube-api-access-p5rl9\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916222 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-tmp\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916240 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-client-ca\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916361 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-serving-cert\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916400 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-proxy-ca-bundles\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916474 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjjq2\" (UniqueName: \"kubernetes.io/projected/4db84825-284d-498f-b92b-121750998712-kube-api-access-xjjq2\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916511 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-config\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916537 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-client-ca\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916559 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4db84825-284d-498f-b92b-121750998712-tmp\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916602 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-config\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916652 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db84825-284d-498f-b92b-121750998712-serving-cert\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916801 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82a64db7-bf3c-410b-9772-e830bfa83118-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916817 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916826 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916834 5200 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916843 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7b6fb5-ca34-4fa3-a509-63527956edec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916851 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82a64db7-bf3c-410b-9772-e830bfa83118-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916878 5200 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82a64db7-bf3c-410b-9772-e830bfa83118-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916889 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzzx5\" (UniqueName: \"kubernetes.io/projected/82a64db7-bf3c-410b-9772-e830bfa83118-kube-api-access-kzzx5\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916900 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2dl4\" (UniqueName: \"kubernetes.io/projected/6d7b6fb5-ca34-4fa3-a509-63527956edec-kube-api-access-d2dl4\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:20 crc kubenswrapper[5200]: I1126 13:35:20.916908 5200 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d7b6fb5-ca34-4fa3-a509-63527956edec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018225 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjjq2\" (UniqueName: \"kubernetes.io/projected/4db84825-284d-498f-b92b-121750998712-kube-api-access-xjjq2\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018270 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-config\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018292 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-client-ca\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018308 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4db84825-284d-498f-b92b-121750998712-tmp\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: 
\"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018493 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-config\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018550 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db84825-284d-498f-b92b-121750998712-serving-cert\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018638 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rl9\" (UniqueName: \"kubernetes.io/projected/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-kube-api-access-p5rl9\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018705 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-tmp\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018717 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4db84825-284d-498f-b92b-121750998712-tmp\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018724 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-client-ca\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018765 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-serving-cert\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.018789 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-proxy-ca-bundles\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.019411 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-client-ca\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.019475 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-client-ca\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.019744 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-tmp\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.020037 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-config\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.020665 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-config\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.021281 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-proxy-ca-bundles\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.024049 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db84825-284d-498f-b92b-121750998712-serving-cert\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.025653 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-serving-cert\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: \"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.034305 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rl9\" (UniqueName: \"kubernetes.io/projected/90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5-kube-api-access-p5rl9\") pod \"route-controller-manager-7fddb5bc55-hkbw9\" (UID: 
\"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5\") " pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.036903 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjjq2\" (UniqueName: \"kubernetes.io/projected/4db84825-284d-498f-b92b-121750998712-kube-api-access-xjjq2\") pod \"controller-manager-f7bc5c4f8-zbp6m\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.118177 5200 generic.go:358] "Generic (PLEG): container finished" podID="82a64db7-bf3c-410b-9772-e830bfa83118" containerID="fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494" exitCode=0 Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.118254 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" event={"ID":"82a64db7-bf3c-410b-9772-e830bfa83118","Type":"ContainerDied","Data":"fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494"} Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.118306 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" event={"ID":"82a64db7-bf3c-410b-9772-e830bfa83118","Type":"ContainerDied","Data":"fc084a296fdbbc4e6740bd3d1b4ec6d4b70cb6d3abd032769acdffe79e49105b"} Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.118328 5200 scope.go:117] "RemoveContainer" containerID="fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.118320 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9" Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.119830 5200 generic.go:358] "Generic (PLEG): container finished" podID="6d7b6fb5-ca34-4fa3-a509-63527956edec" containerID="0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac" exitCode=0 Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.119922 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" event={"ID":"6d7b6fb5-ca34-4fa3-a509-63527956edec","Type":"ContainerDied","Data":"0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac"} Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.119949 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5" event={"ID":"6d7b6fb5-ca34-4fa3-a509-63527956edec","Type":"ContainerDied","Data":"5aacf521c649b8ca52a3834ee6790db199277207eeb15945a691a089b585de16"} Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.120047 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.142357 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9"]
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.158864 5200 scope.go:117] "RemoveContainer" containerID="fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494"
Nov 26 13:35:21 crc kubenswrapper[5200]: E1126 13:35:21.160374 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494\": container with ID starting with fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494 not found: ID does not exist" containerID="fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.160432 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494"} err="failed to get container status \"fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494\": rpc error: code = NotFound desc = could not find container \"fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494\": container with ID starting with fc17aad17df221d37c80ae23d7e51989ad7feb138c5d59a923cef6b2c69a5494 not found: ID does not exist"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.160465 5200 scope.go:117] "RemoveContainer" containerID="0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.166292 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-685h9"]
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.174672 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"]
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.177819 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.179558 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-h6ns5"]
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.184746 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.186617 5200 scope.go:117] "RemoveContainer" containerID="0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac"
Nov 26 13:35:21 crc kubenswrapper[5200]: E1126 13:35:21.187111 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac\": container with ID starting with 0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac not found: ID does not exist" containerID="0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.187164 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac"} err="failed to get container status \"0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac\": rpc error: code = NotFound desc = could not find container \"0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac\": container with ID starting with 0f02040ad56ae89f5c2c21c924f55dcf0e6941299c6dfd7c7ffe7179d72887ac not found: ID does not exist"
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.404013 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9"]
Nov 26 13:35:21 crc kubenswrapper[5200]: W1126 13:35:21.408519 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d2c2c9_76e0_43d9_aeab_b8b5c0b7fca5.slice/crio-5c6dd56727513f1e795c7037ba90c3a051a03c2dfbb2775034544d96ae57173e WatchSource:0}: Error finding container 5c6dd56727513f1e795c7037ba90c3a051a03c2dfbb2775034544d96ae57173e: Status 404 returned error can't find the container with id 5c6dd56727513f1e795c7037ba90c3a051a03c2dfbb2775034544d96ae57173e
Nov 26 13:35:21 crc kubenswrapper[5200]: W1126 13:35:21.431605 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db84825_284d_498f_b92b_121750998712.slice/crio-2435223fa0269f09c6e27126f97a118fcdf204775edab2e3eb30d210f73986b5 WatchSource:0}: Error finding container 2435223fa0269f09c6e27126f97a118fcdf204775edab2e3eb30d210f73986b5: Status 404 returned error can't find the container with id 2435223fa0269f09c6e27126f97a118fcdf204775edab2e3eb30d210f73986b5
Nov 26 13:35:21 crc kubenswrapper[5200]: I1126 13:35:21.432049 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"]
Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.128881 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" event={"ID":"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5","Type":"ContainerStarted","Data":"c4996f6bce67621e67d37a28c5d2d74738ba8757017e1718c3c3480750a308b1"}
Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.129136 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9"
Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.129146 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" event={"ID":"90d2c2c9-76e0-43d9-aeab-b8b5c0b7fca5","Type":"ContainerStarted","Data":"5c6dd56727513f1e795c7037ba90c3a051a03c2dfbb2775034544d96ae57173e"} Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.131145 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" event={"ID":"4db84825-284d-498f-b92b-121750998712","Type":"ContainerStarted","Data":"d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6"} Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.131190 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" event={"ID":"4db84825-284d-498f-b92b-121750998712","Type":"ContainerStarted","Data":"2435223fa0269f09c6e27126f97a118fcdf204775edab2e3eb30d210f73986b5"} Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.131565 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.138494 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.161814 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.173540 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" podStartSLOduration=2.173523284 podStartE2EDuration="2.173523284s" podCreationTimestamp="2025-11-26 13:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:35:22.170721909 +0000 UTC m=+290.849923757" watchObservedRunningTime="2025-11-26 13:35:22.173523284 +0000 UTC m=+290.852725102" Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.175704 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fddb5bc55-hkbw9" podStartSLOduration=2.175679139 podStartE2EDuration="2.175679139s" podCreationTimestamp="2025-11-26 13:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:35:22.154344814 +0000 UTC m=+290.833546652" watchObservedRunningTime="2025-11-26 13:35:22.175679139 +0000 UTC m=+290.854880957" Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.976285 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7b6fb5-ca34-4fa3-a509-63527956edec" path="/var/lib/kubelet/pods/6d7b6fb5-ca34-4fa3-a509-63527956edec/volumes" Nov 26 13:35:22 crc kubenswrapper[5200]: I1126 13:35:22.976996 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a64db7-bf3c-410b-9772-e830bfa83118" path="/var/lib/kubelet/pods/82a64db7-bf3c-410b-9772-e830bfa83118/volumes" Nov 26 13:35:23 crc kubenswrapper[5200]: I1126 13:35:23.431972 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"] Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.151007 5200 kuberuntime_container.go:858] "Killing 
container with a grace period" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" podUID="4db84825-284d-498f-b92b-121750998712" containerName="controller-manager" containerID="cri-o://d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6" gracePeriod=30 Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.542197 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.569099 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6db9644994-dkzjn"] Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.569837 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4db84825-284d-498f-b92b-121750998712" containerName="controller-manager" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.569864 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db84825-284d-498f-b92b-121750998712" containerName="controller-manager" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.570015 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4db84825-284d-498f-b92b-121750998712" containerName="controller-manager" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.584419 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db9644994-dkzjn"] Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.584559 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.686435 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-config\") pod \"4db84825-284d-498f-b92b-121750998712\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.686528 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-client-ca\") pod \"4db84825-284d-498f-b92b-121750998712\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.686630 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-proxy-ca-bundles\") pod \"4db84825-284d-498f-b92b-121750998712\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.686776 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjjq2\" (UniqueName: \"kubernetes.io/projected/4db84825-284d-498f-b92b-121750998712-kube-api-access-xjjq2\") pod \"4db84825-284d-498f-b92b-121750998712\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.686979 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db84825-284d-498f-b92b-121750998712-serving-cert\") pod \"4db84825-284d-498f-b92b-121750998712\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687043 5200 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4db84825-284d-498f-b92b-121750998712-tmp\") pod \"4db84825-284d-498f-b92b-121750998712\" (UID: \"4db84825-284d-498f-b92b-121750998712\") " Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687270 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4d6880d-451f-4ced-9a31-f8765677c994-tmp\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687301 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btmbx\" (UniqueName: \"kubernetes.io/projected/d4d6880d-451f-4ced-9a31-f8765677c994-kube-api-access-btmbx\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687368 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d6880d-451f-4ced-9a31-f8765677c994-serving-cert\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687437 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-config\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687458 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-client-ca\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687468 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4db84825-284d-498f-b92b-121750998712-tmp" (OuterVolumeSpecName: "tmp") pod "4db84825-284d-498f-b92b-121750998712" (UID: "4db84825-284d-498f-b92b-121750998712"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687539 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-client-ca" (OuterVolumeSpecName: "client-ca") pod "4db84825-284d-498f-b92b-121750998712" (UID: "4db84825-284d-498f-b92b-121750998712"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687645 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4db84825-284d-498f-b92b-121750998712" (UID: "4db84825-284d-498f-b92b-121750998712"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687674 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-proxy-ca-bundles\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687867 5200 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687889 5200 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.687904 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4db84825-284d-498f-b92b-121750998712-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.688452 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-config" (OuterVolumeSpecName: "config") pod "4db84825-284d-498f-b92b-121750998712" (UID: "4db84825-284d-498f-b92b-121750998712"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.692673 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4db84825-284d-498f-b92b-121750998712-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4db84825-284d-498f-b92b-121750998712" (UID: "4db84825-284d-498f-b92b-121750998712"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.694804 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4db84825-284d-498f-b92b-121750998712-kube-api-access-xjjq2" (OuterVolumeSpecName: "kube-api-access-xjjq2") pod "4db84825-284d-498f-b92b-121750998712" (UID: "4db84825-284d-498f-b92b-121750998712"). InnerVolumeSpecName "kube-api-access-xjjq2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789125 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-proxy-ca-bundles\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789235 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4d6880d-451f-4ced-9a31-f8765677c994-tmp\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789271 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btmbx\" (UniqueName: \"kubernetes.io/projected/d4d6880d-451f-4ced-9a31-f8765677c994-kube-api-access-btmbx\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789321 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d6880d-451f-4ced-9a31-f8765677c994-serving-cert\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789380 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-config\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789406 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-client-ca\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789517 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db84825-284d-498f-b92b-121750998712-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789536 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjjq2\" (UniqueName: \"kubernetes.io/projected/4db84825-284d-498f-b92b-121750998712-kube-api-access-xjjq2\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.789552 5200 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db84825-284d-498f-b92b-121750998712-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.790764 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-proxy-ca-bundles\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.790906 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-client-ca\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.791115 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4d6880d-451f-4ced-9a31-f8765677c994-tmp\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.792432 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d6880d-451f-4ced-9a31-f8765677c994-config\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.799548 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d6880d-451f-4ced-9a31-f8765677c994-serving-cert\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.825571 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btmbx\" (UniqueName: \"kubernetes.io/projected/d4d6880d-451f-4ced-9a31-f8765677c994-kube-api-access-btmbx\") pod \"controller-manager-6db9644994-dkzjn\" (UID: \"d4d6880d-451f-4ced-9a31-f8765677c994\") " pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:25 crc kubenswrapper[5200]: I1126 13:35:25.904138 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.089154 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db9644994-dkzjn"] Nov 26 13:35:26 crc kubenswrapper[5200]: W1126 13:35:26.094764 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d6880d_451f_4ced_9a31_f8765677c994.slice/crio-a1bf6bc0fc180aa9ffd125dd12e67e976b4d0fbea87bd56812d5fc796d6b03b3 WatchSource:0}: Error finding container a1bf6bc0fc180aa9ffd125dd12e67e976b4d0fbea87bd56812d5fc796d6b03b3: Status 404 returned error can't find the container with id a1bf6bc0fc180aa9ffd125dd12e67e976b4d0fbea87bd56812d5fc796d6b03b3 Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.160612 5200 generic.go:358] "Generic (PLEG): container finished" podID="4db84825-284d-498f-b92b-121750998712" containerID="d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6" exitCode=0 Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.160677 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" event={"ID":"4db84825-284d-498f-b92b-121750998712","Type":"ContainerDied","Data":"d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6"} Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.160731 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" event={"ID":"4db84825-284d-498f-b92b-121750998712","Type":"ContainerDied","Data":"2435223fa0269f09c6e27126f97a118fcdf204775edab2e3eb30d210f73986b5"} Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.160751 5200 scope.go:117] "RemoveContainer" containerID="d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6" Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.160773 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m" Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.164449 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" event={"ID":"d4d6880d-451f-4ced-9a31-f8765677c994","Type":"ContainerStarted","Data":"a1bf6bc0fc180aa9ffd125dd12e67e976b4d0fbea87bd56812d5fc796d6b03b3"} Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.203768 5200 scope.go:117] "RemoveContainer" containerID="d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6" Nov 26 13:35:26 crc kubenswrapper[5200]: E1126 13:35:26.204237 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6\": container with ID starting with d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6 not found: ID does not exist" containerID="d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6" Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.204275 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6"} err="failed to get container status \"d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6\": rpc error: code = NotFound desc = could not find container \"d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6\": container with ID starting with d921f0bf2b6b7b66b009e4a88344092987c2828fd9400f46a69c3b8c0550bec6 not found: ID does not exist" Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.218560 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"] Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.225435 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f7bc5c4f8-zbp6m"] Nov 26 13:35:26 crc kubenswrapper[5200]: I1126 13:35:26.980422 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db84825-284d-498f-b92b-121750998712" path="/var/lib/kubelet/pods/4db84825-284d-498f-b92b-121750998712/volumes" Nov 26 13:35:27 crc kubenswrapper[5200]: I1126 13:35:27.177609 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" event={"ID":"d4d6880d-451f-4ced-9a31-f8765677c994","Type":"ContainerStarted","Data":"bb82cabcd884bdfeaa1fb22f045e10321f8b603247a41993f6bad10df42d54d4"} Nov 26 13:35:27 crc kubenswrapper[5200]: I1126 13:35:27.177919 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:27 crc kubenswrapper[5200]: I1126 13:35:27.187106 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" Nov 26 13:35:27 crc kubenswrapper[5200]: I1126 13:35:27.202400 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6db9644994-dkzjn" podStartSLOduration=4.2023796 podStartE2EDuration="4.2023796s" podCreationTimestamp="2025-11-26 13:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:35:27.20073644 +0000 UTC 
m=+295.879938288" watchObservedRunningTime="2025-11-26 13:35:27.2023796 +0000 UTC m=+295.881581428" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.191344 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vhwf"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.192477 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8vhwf" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="registry-server" containerID="cri-o://828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a" gracePeriod=30 Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.221742 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djxct"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.222622 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djxct" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="registry-server" containerID="cri-o://8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef" gracePeriod=30 Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.265011 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-gj489"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.265472 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" containerID="cri-o://cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016" gracePeriod=30 Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.272620 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7dzl"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.275529 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t7dzl" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="registry-server" containerID="cri-o://ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" gracePeriod=30 Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.301614 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdfrr"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.303277 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cdfrr" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="registry-server" containerID="cri-o://d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845" gracePeriod=30 Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.316490 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-797xh"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.321858 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-797xh"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.321996 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: E1126 13:35:31.479951 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad is running failed: container process not found" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.480059 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/21cfe3b7-3913-457e-a487-87372a7b9505-tmp\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.480744 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg7qt\" (UniqueName: \"kubernetes.io/projected/21cfe3b7-3913-457e-a487-87372a7b9505-kube-api-access-qg7qt\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.480909 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21cfe3b7-3913-457e-a487-87372a7b9505-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.480989 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21cfe3b7-3913-457e-a487-87372a7b9505-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: E1126 13:35:31.481224 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad is running failed: container process not found" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 13:35:31 crc kubenswrapper[5200]: E1126 13:35:31.482102 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad is running failed: container process not found" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 13:35:31 crc kubenswrapper[5200]: E1126 13:35:31.482150 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/redhat-marketplace-t7dzl" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="registry-server" probeResult="unknown" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.581950 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qg7qt\" (UniqueName: \"kubernetes.io/projected/21cfe3b7-3913-457e-a487-87372a7b9505-kube-api-access-qg7qt\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.587022 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21cfe3b7-3913-457e-a487-87372a7b9505-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.587128 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21cfe3b7-3913-457e-a487-87372a7b9505-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.593112 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/21cfe3b7-3913-457e-a487-87372a7b9505-tmp\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.593770 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/21cfe3b7-3913-457e-a487-87372a7b9505-tmp\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.594489 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21cfe3b7-3913-457e-a487-87372a7b9505-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.594983 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21cfe3b7-3913-457e-a487-87372a7b9505-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.599246 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg7qt\" (UniqueName: \"kubernetes.io/projected/21cfe3b7-3913-457e-a487-87372a7b9505-kube-api-access-qg7qt\") pod \"marketplace-operator-547dbd544d-797xh\" (UID: \"21cfe3b7-3913-457e-a487-87372a7b9505\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 
crc kubenswrapper[5200]: I1126 13:35:31.649933 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.654706 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.681324 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.682194 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.722415 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.758798 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796037 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p4w7\" (UniqueName: \"kubernetes.io/projected/08ffe275-d9f0-491b-bc0a-ae428458c956-kube-api-access-7p4w7\") pod \"08ffe275-d9f0-491b-bc0a-ae428458c956\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796147 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-catalog-content\") pod \"65b299e0-e301-40f7-a74c-d7403b784d36\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796200 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-trusted-ca\") pod \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796219 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-operator-metrics\") pod \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796269 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-utilities\") pod \"08ffe275-d9f0-491b-bc0a-ae428458c956\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796317 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jjrn\" (UniqueName: \"kubernetes.io/projected/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-kube-api-access-6jjrn\") pod \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796332 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-utilities\") pod \"65b299e0-e301-40f7-a74c-d7403b784d36\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796372 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-tmp\") pod \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\" (UID: \"10b6ac0f-ebd5-49ff-a2d7-f729135c410e\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796400 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flc5h\" (UniqueName: \"kubernetes.io/projected/65b299e0-e301-40f7-a74c-d7403b784d36-kube-api-access-flc5h\") pod \"65b299e0-e301-40f7-a74c-d7403b784d36\" (UID: \"65b299e0-e301-40f7-a74c-d7403b784d36\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.796456 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-catalog-content\") pod \"08ffe275-d9f0-491b-bc0a-ae428458c956\" (UID: \"08ffe275-d9f0-491b-bc0a-ae428458c956\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.798256 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "10b6ac0f-ebd5-49ff-a2d7-f729135c410e" (UID: "10b6ac0f-ebd5-49ff-a2d7-f729135c410e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.798539 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-utilities" (OuterVolumeSpecName: "utilities") pod "08ffe275-d9f0-491b-bc0a-ae428458c956" (UID: "08ffe275-d9f0-491b-bc0a-ae428458c956"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.800563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b299e0-e301-40f7-a74c-d7403b784d36-kube-api-access-flc5h" (OuterVolumeSpecName: "kube-api-access-flc5h") pod "65b299e0-e301-40f7-a74c-d7403b784d36" (UID: "65b299e0-e301-40f7-a74c-d7403b784d36"). InnerVolumeSpecName "kube-api-access-flc5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.800588 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-tmp" (OuterVolumeSpecName: "tmp") pod "10b6ac0f-ebd5-49ff-a2d7-f729135c410e" (UID: "10b6ac0f-ebd5-49ff-a2d7-f729135c410e"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.801060 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ffe275-d9f0-491b-bc0a-ae428458c956-kube-api-access-7p4w7" (OuterVolumeSpecName: "kube-api-access-7p4w7") pod "08ffe275-d9f0-491b-bc0a-ae428458c956" (UID: "08ffe275-d9f0-491b-bc0a-ae428458c956"). InnerVolumeSpecName "kube-api-access-7p4w7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.803503 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-kube-api-access-6jjrn" (OuterVolumeSpecName: "kube-api-access-6jjrn") pod "10b6ac0f-ebd5-49ff-a2d7-f729135c410e" (UID: "10b6ac0f-ebd5-49ff-a2d7-f729135c410e"). InnerVolumeSpecName "kube-api-access-6jjrn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.818871 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-utilities" (OuterVolumeSpecName: "utilities") pod "65b299e0-e301-40f7-a74c-d7403b784d36" (UID: "65b299e0-e301-40f7-a74c-d7403b784d36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.824249 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65b299e0-e301-40f7-a74c-d7403b784d36" (UID: "65b299e0-e301-40f7-a74c-d7403b784d36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.824914 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "10b6ac0f-ebd5-49ff-a2d7-f729135c410e" (UID: "10b6ac0f-ebd5-49ff-a2d7-f729135c410e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.864201 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08ffe275-d9f0-491b-bc0a-ae428458c956" (UID: "08ffe275-d9f0-491b-bc0a-ae428458c956"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.898637 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvx89\" (UniqueName: \"kubernetes.io/projected/a8c12d33-9b04-4eab-9f47-289246106a6f-kube-api-access-jvx89\") pod \"a8c12d33-9b04-4eab-9f47-289246106a6f\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.899340 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-utilities\") pod \"aaf837ae-040e-49b2-afff-9c189bbe80e8\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.899468 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fszn\" (UniqueName: \"kubernetes.io/projected/aaf837ae-040e-49b2-afff-9c189bbe80e8-kube-api-access-6fszn\") pod \"aaf837ae-040e-49b2-afff-9c189bbe80e8\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.899570 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-catalog-content\") pod \"aaf837ae-040e-49b2-afff-9c189bbe80e8\" (UID: \"aaf837ae-040e-49b2-afff-9c189bbe80e8\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.899768 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-utilities\") pod \"a8c12d33-9b04-4eab-9f47-289246106a6f\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.899945 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-catalog-content\") pod \"a8c12d33-9b04-4eab-9f47-289246106a6f\" (UID: \"a8c12d33-9b04-4eab-9f47-289246106a6f\") " Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900359 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7p4w7\" (UniqueName: \"kubernetes.io/projected/08ffe275-d9f0-491b-bc0a-ae428458c956-kube-api-access-7p4w7\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900488 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900595 5200 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900722 5200 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900836 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-utilities\") on node 
\"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900940 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jjrn\" (UniqueName: \"kubernetes.io/projected/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-kube-api-access-6jjrn\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.901066 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65b299e0-e301-40f7-a74c-d7403b784d36-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.901176 5200 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b6ac0f-ebd5-49ff-a2d7-f729135c410e-tmp\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.901276 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-flc5h\" (UniqueName: \"kubernetes.io/projected/65b299e0-e301-40f7-a74c-d7403b784d36-kube-api-access-flc5h\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.901402 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ffe275-d9f0-491b-bc0a-ae428458c956-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.900597 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-utilities" (OuterVolumeSpecName: "utilities") pod "aaf837ae-040e-49b2-afff-9c189bbe80e8" (UID: "aaf837ae-040e-49b2-afff-9c189bbe80e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.902856 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-utilities" (OuterVolumeSpecName: "utilities") pod "a8c12d33-9b04-4eab-9f47-289246106a6f" (UID: "a8c12d33-9b04-4eab-9f47-289246106a6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.902927 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf837ae-040e-49b2-afff-9c189bbe80e8-kube-api-access-6fszn" (OuterVolumeSpecName: "kube-api-access-6fszn") pod "aaf837ae-040e-49b2-afff-9c189bbe80e8" (UID: "aaf837ae-040e-49b2-afff-9c189bbe80e8"). InnerVolumeSpecName "kube-api-access-6fszn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.908242 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c12d33-9b04-4eab-9f47-289246106a6f-kube-api-access-jvx89" (OuterVolumeSpecName: "kube-api-access-jvx89") pod "a8c12d33-9b04-4eab-9f47-289246106a6f" (UID: "a8c12d33-9b04-4eab-9f47-289246106a6f"). InnerVolumeSpecName "kube-api-access-jvx89". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.912078 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8c12d33-9b04-4eab-9f47-289246106a6f" (UID: "a8c12d33-9b04-4eab-9f47-289246106a6f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.920109 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-797xh"] Nov 26 13:35:31 crc kubenswrapper[5200]: I1126 13:35:31.982965 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaf837ae-040e-49b2-afff-9c189bbe80e8" (UID: "aaf837ae-040e-49b2-afff-9c189bbe80e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.002243 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.002281 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c12d33-9b04-4eab-9f47-289246106a6f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.002295 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvx89\" (UniqueName: \"kubernetes.io/projected/a8c12d33-9b04-4eab-9f47-289246106a6f-kube-api-access-jvx89\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.002307 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.002318 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6fszn\" (UniqueName: \"kubernetes.io/projected/aaf837ae-040e-49b2-afff-9c189bbe80e8-kube-api-access-6fszn\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.002328 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf837ae-040e-49b2-afff-9c189bbe80e8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.249600 5200 generic.go:358] "Generic (PLEG): container finished" podID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerID="d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845" exitCode=0 Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.249744 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdfrr" event={"ID":"aaf837ae-040e-49b2-afff-9c189bbe80e8","Type":"ContainerDied","Data":"d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.249782 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdfrr" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.249839 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdfrr" event={"ID":"aaf837ae-040e-49b2-afff-9c189bbe80e8","Type":"ContainerDied","Data":"9b795e45f35205933c1f175f89da04de009bcf8194e6224a098ed9c380b9bc2f"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.249874 5200 scope.go:117] "RemoveContainer" containerID="d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.253002 5200 generic.go:358] "Generic (PLEG): container finished" podID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" exitCode=0 Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.253055 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7dzl" event={"ID":"a8c12d33-9b04-4eab-9f47-289246106a6f","Type":"ContainerDied","Data":"ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.253127 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t7dzl" event={"ID":"a8c12d33-9b04-4eab-9f47-289246106a6f","Type":"ContainerDied","Data":"1e0c479495a567c7d9c9b69a933be74573280e99f85b17e26c22e56fcd24d1ac"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.253143 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t7dzl" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.255729 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" event={"ID":"21cfe3b7-3913-457e-a487-87372a7b9505","Type":"ContainerStarted","Data":"c0976c0c75881f28c20f46d7bacf704c51a59b3f85e1fc552908a386010fe4d0"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.255772 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" event={"ID":"21cfe3b7-3913-457e-a487-87372a7b9505","Type":"ContainerStarted","Data":"7d7e54db5bbbc25e4c1ef0addea27105b73f9a1c36a0e1433940a815230c0f1c"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.255972 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.257545 5200 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-797xh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.62:8080/healthz\": dial tcp 10.217.0.62:8080: connect: connection refused" start-of-body= Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.257631 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" podUID="21cfe3b7-3913-457e-a487-87372a7b9505" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.62:8080/healthz\": dial tcp 10.217.0.62:8080: connect: connection refused" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.259127 5200 generic.go:358] "Generic (PLEG): container finished" podID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerID="cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016" exitCode=0 
Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.259220 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" event={"ID":"10b6ac0f-ebd5-49ff-a2d7-f729135c410e","Type":"ContainerDied","Data":"cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.259249 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" event={"ID":"10b6ac0f-ebd5-49ff-a2d7-f729135c410e","Type":"ContainerDied","Data":"fb9b1c2ebaf22a1777f90baca390a0cc513ff626386ff319fa5e28643e0daf3a"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.259297 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-gj489" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.265029 5200 generic.go:358] "Generic (PLEG): container finished" podID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerID="8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef" exitCode=0 Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.265163 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerDied","Data":"8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.265211 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djxct" event={"ID":"08ffe275-d9f0-491b-bc0a-ae428458c956","Type":"ContainerDied","Data":"88322b41c70e7aba7830c95b94424a0d6724b61215357c1cd9cb00408b1dab7a"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.265385 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djxct" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.269185 5200 generic.go:358] "Generic (PLEG): container finished" podID="65b299e0-e301-40f7-a74c-d7403b784d36" containerID="828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a" exitCode=0 Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.269338 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhwf" event={"ID":"65b299e0-e301-40f7-a74c-d7403b784d36","Type":"ContainerDied","Data":"828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.269371 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhwf" event={"ID":"65b299e0-e301-40f7-a74c-d7403b784d36","Type":"ContainerDied","Data":"66756543a65080fcaedd6021ef3c750ff483d963545925747d12306dcc8f476a"} Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.269464 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8vhwf" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.277640 5200 scope.go:117] "RemoveContainer" containerID="72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.295544 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" podStartSLOduration=1.295521259 podStartE2EDuration="1.295521259s" podCreationTimestamp="2025-11-26 13:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:35:32.282827675 +0000 UTC m=+300.962029533" watchObservedRunningTime="2025-11-26 13:35:32.295521259 +0000 UTC m=+300.974723077" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.311607 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdfrr"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.315437 5200 scope.go:117] "RemoveContainer" containerID="3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.315460 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cdfrr"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.338593 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7dzl"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.341960 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t7dzl"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.346617 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djxct"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.349191 5200 scope.go:117] "RemoveContainer" containerID="d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.349645 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845\": container with ID starting with d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845 not found: ID does not exist" containerID="d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.349712 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845"} err="failed to get container status \"d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845\": rpc error: code = NotFound desc = could not find container \"d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845\": container with ID starting with d1642508e7333e9e4050df75980a12329bdfe9ff0c5f3869609cc1d5a5241845 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.349747 5200 scope.go:117] "RemoveContainer" containerID="72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.350081 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djxct"] Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.350092 5200 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704\": container with ID starting with 72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704 not found: ID does not exist" containerID="72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.350449 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704"} err="failed to get container status \"72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704\": rpc error: code = NotFound desc = could not find container \"72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704\": container with ID starting with 72f2b942c2944f718a7c2f312ee1d79847a60e28d85756c7024b8267afa7a704 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.350562 5200 scope.go:117] "RemoveContainer" containerID="3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.350891 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a\": container with ID starting with 3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a not found: ID does not exist" containerID="3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.350931 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a"} err="failed to get container status \"3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a\": rpc error: code = NotFound desc = could not find container \"3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a\": container with ID starting with 3f6a21a09d6929534745f6db5b7badaf7dac0a3141935854ad37a6dd0497746a not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.350949 5200 scope.go:117] "RemoveContainer" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.354608 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vhwf"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.357432 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8vhwf"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.363880 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-gj489"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.372646 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-gj489"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.391259 5200 scope.go:117] "RemoveContainer" containerID="446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.394496 5200 cert_rotation.go:92] "Certificate rotation detected, shutting down client connections to start using new credentials" logger="tls-transport-cache" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 
13:35:32.411165 5200 scope.go:117] "RemoveContainer" containerID="347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.439869 5200 scope.go:117] "RemoveContainer" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.440366 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad\": container with ID starting with ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad not found: ID does not exist" containerID="ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.440485 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad"} err="failed to get container status \"ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad\": rpc error: code = NotFound desc = could not find container \"ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad\": container with ID starting with ddd2bec6d30e65fe134ab6d0321f502686db6f13cb0bb8d06895f6fb883683ad not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.440605 5200 scope.go:117] "RemoveContainer" containerID="446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.441173 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6\": container with ID starting with 446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6 not found: ID does not exist" containerID="446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.441306 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6"} err="failed to get container status \"446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6\": rpc error: code = NotFound desc = could not find container \"446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6\": container with ID starting with 446d70eb5279109404c371878726278f179061a84ae1ffdb9215621ed0611fb6 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.441430 5200 scope.go:117] "RemoveContainer" containerID="347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.441841 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81\": container with ID starting with 347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81 not found: ID does not exist" containerID="347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.441879 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81"} err="failed to get container status 
\"347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81\": rpc error: code = NotFound desc = could not find container \"347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81\": container with ID starting with 347d321f850f65324a60297d71ac3d581bf9a9699390e839a349db8fd10a9e81 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.441900 5200 scope.go:117] "RemoveContainer" containerID="cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.456414 5200 scope.go:117] "RemoveContainer" containerID="fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.479429 5200 scope.go:117] "RemoveContainer" containerID="cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.480201 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016\": container with ID starting with cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016 not found: ID does not exist" containerID="cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.480231 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016"} err="failed to get container status \"cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016\": rpc error: code = NotFound desc = could not find container \"cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016\": container with ID starting with cd105dc41be4194d34032f0dd64dee393ee9189c1166894054cceef719c97016 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.480290 5200 scope.go:117] "RemoveContainer" containerID="fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.480644 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289\": container with ID starting with fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289 not found: ID does not exist" containerID="fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.480697 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289"} err="failed to get container status \"fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289\": rpc error: code = NotFound desc = could not find container \"fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289\": container with ID starting with fe82c600f950051a31c10ad5a66cd38492c125671ef50cf84fac420e05780289 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.480726 5200 scope.go:117] "RemoveContainer" containerID="8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.503071 5200 scope.go:117] "RemoveContainer" containerID="5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a" Nov 26 13:35:32 crc 
kubenswrapper[5200]: I1126 13:35:32.543113 5200 scope.go:117] "RemoveContainer" containerID="6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.567074 5200 scope.go:117] "RemoveContainer" containerID="8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.567561 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef\": container with ID starting with 8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef not found: ID does not exist" containerID="8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.567594 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef"} err="failed to get container status \"8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef\": rpc error: code = NotFound desc = could not find container \"8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef\": container with ID starting with 8ca2e0c414a46a57e146c6fee416c6c359e01f4100e978b972faba19d0534aef not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.567619 5200 scope.go:117] "RemoveContainer" containerID="5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.567982 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a\": container with ID starting with 5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a not found: ID does not exist" containerID="5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.568005 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a"} err="failed to get container status \"5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a\": rpc error: code = NotFound desc = could not find container \"5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a\": container with ID starting with 5ebf1c761e3f0be2cc9c394bbeba17649eb1b40355043c189510cb762ea3960a not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.568018 5200 scope.go:117] "RemoveContainer" containerID="6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.568319 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e\": container with ID starting with 6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e not found: ID does not exist" containerID="6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.568338 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e"} err="failed to get container status 
\"6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e\": rpc error: code = NotFound desc = could not find container \"6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e\": container with ID starting with 6da8451d17e630018597d93eb56384ec69f0f36700b37b643efc3dd565a9342e not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.568349 5200 scope.go:117] "RemoveContainer" containerID="828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.584042 5200 scope.go:117] "RemoveContainer" containerID="abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.653986 5200 scope.go:117] "RemoveContainer" containerID="98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.664761 5200 scope.go:117] "RemoveContainer" containerID="828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.665221 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a\": container with ID starting with 828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a not found: ID does not exist" containerID="828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.665308 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a"} err="failed to get container status \"828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a\": rpc error: code = NotFound desc = could not find container \"828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a\": container with ID starting with 828346515c651cc62420b96dd3571c2126ead77fe00d4a2b041796aa8f99a63a not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.665392 5200 scope.go:117] "RemoveContainer" containerID="abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250" Nov 26 13:35:32 crc kubenswrapper[5200]: E1126 13:35:32.665719 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250\": container with ID starting with abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250 not found: ID does not exist" containerID="abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.665742 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250"} err="failed to get container status \"abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250\": rpc error: code = NotFound desc = could not find container \"abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250\": container with ID starting with abc5aea190acb4ed2f7784bfc379d5b618175b706ad1ba98be23b724e5db5250 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.665756 5200 scope.go:117] "RemoveContainer" containerID="98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770" Nov 26 13:35:32 crc 
kubenswrapper[5200]: E1126 13:35:32.666121 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770\": container with ID starting with 98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770 not found: ID does not exist" containerID="98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.666257 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770"} err="failed to get container status \"98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770\": rpc error: code = NotFound desc = could not find container \"98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770\": container with ID starting with 98b6fd94190fe76817a44d5ed134376ff10e8aae3d151c955dd313a4116f0770 not found: ID does not exist" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.806252 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wdpbr"] Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807189 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807221 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807237 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807248 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807266 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807278 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807297 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807307 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807321 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807331 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807350 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 
13:35:32.807360 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807374 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807384 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807397 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807408 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807421 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807431 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807451 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807461 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807475 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807485 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807495 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807505 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807532 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807542 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="extract-content" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807558 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807569 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="extract-utilities" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807718 5200 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807737 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" containerName="marketplace-operator" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807751 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807764 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807783 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" containerName="registry-server" Nov 26 13:35:32 crc kubenswrapper[5200]: I1126 13:35:32.807799 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" containerName="registry-server" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.090440 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.093890 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.098679 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ffe275-d9f0-491b-bc0a-ae428458c956" path="/var/lib/kubelet/pods/08ffe275-d9f0-491b-bc0a-ae428458c956/volumes" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.100260 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b6ac0f-ebd5-49ff-a2d7-f729135c410e" path="/var/lib/kubelet/pods/10b6ac0f-ebd5-49ff-a2d7-f729135c410e/volumes" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.102258 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b299e0-e301-40f7-a74c-d7403b784d36" path="/var/lib/kubelet/pods/65b299e0-e301-40f7-a74c-d7403b784d36/volumes" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.104646 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c12d33-9b04-4eab-9f47-289246106a6f" path="/var/lib/kubelet/pods/a8c12d33-9b04-4eab-9f47-289246106a6f/volumes" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.106388 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf837ae-040e-49b2-afff-9c189bbe80e8" path="/var/lib/kubelet/pods/aaf837ae-040e-49b2-afff-9c189bbe80e8/volumes" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.108481 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wdpbr"] Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.218629 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-catalog-content\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.218736 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-utilities\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.218789 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwb2c\" (UniqueName: \"kubernetes.io/projected/395d7072-6b17-40a6-9969-3a5ddf3de469-kube-api-access-jwb2c\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.288597 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-797xh" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.320105 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-utilities\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.320195 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwb2c\" (UniqueName: \"kubernetes.io/projected/395d7072-6b17-40a6-9969-3a5ddf3de469-kube-api-access-jwb2c\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.320272 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-catalog-content\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.320936 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-catalog-content\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.321382 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-utilities\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.371646 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwb2c\" (UniqueName: \"kubernetes.io/projected/395d7072-6b17-40a6-9969-3a5ddf3de469-kube-api-access-jwb2c\") pod \"certified-operators-wdpbr\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.411822 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.486153 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.487207 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:35:33 crc kubenswrapper[5200]: I1126 13:35:33.850833 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wdpbr"] Nov 26 13:35:33 crc kubenswrapper[5200]: W1126 13:35:33.858620 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395d7072_6b17_40a6_9969_3a5ddf3de469.slice/crio-1a1e2ab9a26a4f9b08c5b12194eae8b335351a3b8d7a98238bfa42a8aaef91ae WatchSource:0}: Error finding container 1a1e2ab9a26a4f9b08c5b12194eae8b335351a3b8d7a98238bfa42a8aaef91ae: Status 404 returned error can't find the container with id 1a1e2ab9a26a4f9b08c5b12194eae8b335351a3b8d7a98238bfa42a8aaef91ae Nov 26 13:35:34 crc kubenswrapper[5200]: E1126 13:35:34.141295 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395d7072_6b17_40a6_9969_3a5ddf3de469.slice/crio-8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395d7072_6b17_40a6_9969_3a5ddf3de469.slice/crio-conmon-8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4.scope\": RecentStats: unable to find data in memory cache]" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.210800 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4f7tr"] Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.243673 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f7tr"] Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.243975 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.246544 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.292906 5200 generic.go:358] "Generic (PLEG): container finished" podID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerID="8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4" exitCode=0 Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.293085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerDied","Data":"8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4"} Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.293152 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerStarted","Data":"1a1e2ab9a26a4f9b08c5b12194eae8b335351a3b8d7a98238bfa42a8aaef91ae"} Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.294528 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.336643 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/170ad884-56d8-4fe6-8009-871a11ea9e9a-catalog-content\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.337305 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/170ad884-56d8-4fe6-8009-871a11ea9e9a-utilities\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.337371 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgkg\" (UniqueName: \"kubernetes.io/projected/170ad884-56d8-4fe6-8009-871a11ea9e9a-kube-api-access-ncgkg\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.438828 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/170ad884-56d8-4fe6-8009-871a11ea9e9a-utilities\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.438940 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgkg\" (UniqueName: \"kubernetes.io/projected/170ad884-56d8-4fe6-8009-871a11ea9e9a-kube-api-access-ncgkg\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.439418 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/170ad884-56d8-4fe6-8009-871a11ea9e9a-utilities\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.439555 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/170ad884-56d8-4fe6-8009-871a11ea9e9a-catalog-content\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.441007 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/170ad884-56d8-4fe6-8009-871a11ea9e9a-catalog-content\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.465273 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgkg\" (UniqueName: \"kubernetes.io/projected/170ad884-56d8-4fe6-8009-871a11ea9e9a-kube-api-access-ncgkg\") pod \"redhat-marketplace-4f7tr\" (UID: \"170ad884-56d8-4fe6-8009-871a11ea9e9a\") " pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:34 crc kubenswrapper[5200]: I1126 13:35:34.566838 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.018523 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f7tr"] Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.211291 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j2dll"] Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.217180 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.219116 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.223556 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2dll"] Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.305438 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerStarted","Data":"2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd"} Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.306665 5200 generic.go:358] "Generic (PLEG): container finished" podID="170ad884-56d8-4fe6-8009-871a11ea9e9a" containerID="4bf1e563dcc77fa114721335de40c629f29dce7675b7e40642dffbd55f9256ba" exitCode=0 Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.306790 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f7tr" event={"ID":"170ad884-56d8-4fe6-8009-871a11ea9e9a","Type":"ContainerDied","Data":"4bf1e563dcc77fa114721335de40c629f29dce7675b7e40642dffbd55f9256ba"} Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.306835 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f7tr" event={"ID":"170ad884-56d8-4fe6-8009-871a11ea9e9a","Type":"ContainerStarted","Data":"bfb19bad4a5c4254b6d6a7b006f972bcdc1a2cee8491e9f8dba8d60d2b765766"} Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.350090 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flh54\" (UniqueName: \"kubernetes.io/projected/4aa9113c-1a68-4764-ad6f-2c0b804e194b-kube-api-access-flh54\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.350177 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa9113c-1a68-4764-ad6f-2c0b804e194b-catalog-content\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.350204 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa9113c-1a68-4764-ad6f-2c0b804e194b-utilities\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.451333 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa9113c-1a68-4764-ad6f-2c0b804e194b-catalog-content\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.451803 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa9113c-1a68-4764-ad6f-2c0b804e194b-utilities\") pod 
\"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.451827 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa9113c-1a68-4764-ad6f-2c0b804e194b-catalog-content\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.451860 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flh54\" (UniqueName: \"kubernetes.io/projected/4aa9113c-1a68-4764-ad6f-2c0b804e194b-kube-api-access-flh54\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.452123 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa9113c-1a68-4764-ad6f-2c0b804e194b-utilities\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.495119 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flh54\" (UniqueName: \"kubernetes.io/projected/4aa9113c-1a68-4764-ad6f-2c0b804e194b-kube-api-access-flh54\") pod \"redhat-operators-j2dll\" (UID: \"4aa9113c-1a68-4764-ad6f-2c0b804e194b\") " pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:35 crc kubenswrapper[5200]: I1126 13:35:35.601070 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.045041 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2dll"] Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.317388 5200 generic.go:358] "Generic (PLEG): container finished" podID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerID="2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd" exitCode=0 Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.317527 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerDied","Data":"2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd"} Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.323185 5200 generic.go:358] "Generic (PLEG): container finished" podID="4aa9113c-1a68-4764-ad6f-2c0b804e194b" containerID="2fab77e1ee4e16343ca208ddf04f6a4c728809bcc1d75be47d1a16c12d916b4c" exitCode=0 Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.323289 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2dll" event={"ID":"4aa9113c-1a68-4764-ad6f-2c0b804e194b","Type":"ContainerDied","Data":"2fab77e1ee4e16343ca208ddf04f6a4c728809bcc1d75be47d1a16c12d916b4c"} Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.323321 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2dll" event={"ID":"4aa9113c-1a68-4764-ad6f-2c0b804e194b","Type":"ContainerStarted","Data":"fe821d05acba529282be391275230d3fc0a1c7979062892776041910e4f382e0"} Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.611282 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nxr6n"] Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.628714 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxr6n"] Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.628908 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.632633 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.779579 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-utilities\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.779663 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7z5l\" (UniqueName: \"kubernetes.io/projected/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-kube-api-access-c7z5l\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.779713 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-catalog-content\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.880873 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7z5l\" (UniqueName: \"kubernetes.io/projected/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-kube-api-access-c7z5l\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.881231 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-catalog-content\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.881422 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-utilities\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.882257 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-utilities\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.882309 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-catalog-content\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.907124 
5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7z5l\" (UniqueName: \"kubernetes.io/projected/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-kube-api-access-c7z5l\") pod \"community-operators-nxr6n\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:36 crc kubenswrapper[5200]: I1126 13:35:36.954371 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:37 crc kubenswrapper[5200]: I1126 13:35:37.332314 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerStarted","Data":"2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c"} Nov 26 13:35:37 crc kubenswrapper[5200]: I1126 13:35:37.336724 5200 generic.go:358] "Generic (PLEG): container finished" podID="170ad884-56d8-4fe6-8009-871a11ea9e9a" containerID="fa705b6417339f99aef5922881b985f8da42d8432619e7cdcd77905136c51815" exitCode=0 Nov 26 13:35:37 crc kubenswrapper[5200]: I1126 13:35:37.338011 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f7tr" event={"ID":"170ad884-56d8-4fe6-8009-871a11ea9e9a","Type":"ContainerDied","Data":"fa705b6417339f99aef5922881b985f8da42d8432619e7cdcd77905136c51815"} Nov 26 13:35:37 crc kubenswrapper[5200]: I1126 13:35:37.357040 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wdpbr" podStartSLOduration=4.5471260319999995 podStartE2EDuration="5.357016873s" podCreationTimestamp="2025-11-26 13:35:32 +0000 UTC" firstStartedPulling="2025-11-26 13:35:34.29648834 +0000 UTC m=+302.975690188" lastFinishedPulling="2025-11-26 13:35:35.106379201 +0000 UTC m=+303.785581029" observedRunningTime="2025-11-26 13:35:37.350115714 +0000 UTC m=+306.029317542" watchObservedRunningTime="2025-11-26 13:35:37.357016873 +0000 UTC m=+306.036218691" Nov 26 13:35:37 crc kubenswrapper[5200]: I1126 13:35:37.467128 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxr6n"] Nov 26 13:35:37 crc kubenswrapper[5200]: W1126 13:35:37.467895 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c896c5_a00d_4794_9c6b_0f4e7eefb79c.slice/crio-87a61c420e19e4e9c5deb9794f84d61bc71ae4990d5af408c031a335819cfa92 WatchSource:0}: Error finding container 87a61c420e19e4e9c5deb9794f84d61bc71ae4990d5af408c031a335819cfa92: Status 404 returned error can't find the container with id 87a61c420e19e4e9c5deb9794f84d61bc71ae4990d5af408c031a335819cfa92 Nov 26 13:35:38 crc kubenswrapper[5200]: I1126 13:35:38.348669 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f7tr" event={"ID":"170ad884-56d8-4fe6-8009-871a11ea9e9a","Type":"ContainerStarted","Data":"bdd14a3fce112a3ae05b4274d61c21b5121be2578516e79bbe2c4f38b334bf2e"} Nov 26 13:35:38 crc kubenswrapper[5200]: I1126 13:35:38.349967 5200 generic.go:358] "Generic (PLEG): container finished" podID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerID="77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea" exitCode=0 Nov 26 13:35:38 crc kubenswrapper[5200]: I1126 13:35:38.351085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxr6n" 
event={"ID":"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c","Type":"ContainerDied","Data":"77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea"} Nov 26 13:35:38 crc kubenswrapper[5200]: I1126 13:35:38.351126 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxr6n" event={"ID":"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c","Type":"ContainerStarted","Data":"87a61c420e19e4e9c5deb9794f84d61bc71ae4990d5af408c031a335819cfa92"} Nov 26 13:35:38 crc kubenswrapper[5200]: I1126 13:35:38.374706 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4f7tr" podStartSLOduration=2.68169205 podStartE2EDuration="4.374663377s" podCreationTimestamp="2025-11-26 13:35:34 +0000 UTC" firstStartedPulling="2025-11-26 13:35:35.307440392 +0000 UTC m=+303.986642210" lastFinishedPulling="2025-11-26 13:35:37.000411709 +0000 UTC m=+305.679613537" observedRunningTime="2025-11-26 13:35:38.368949054 +0000 UTC m=+307.048150872" watchObservedRunningTime="2025-11-26 13:35:38.374663377 +0000 UTC m=+307.053865205" Nov 26 13:35:39 crc kubenswrapper[5200]: I1126 13:35:39.358987 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2dll" event={"ID":"4aa9113c-1a68-4764-ad6f-2c0b804e194b","Type":"ContainerStarted","Data":"a3b7213cfb0cff1a6ba06832b1c47083b4fcee88971a6139dea510f3e710cb82"} Nov 26 13:35:40 crc kubenswrapper[5200]: I1126 13:35:40.367059 5200 generic.go:358] "Generic (PLEG): container finished" podID="4aa9113c-1a68-4764-ad6f-2c0b804e194b" containerID="a3b7213cfb0cff1a6ba06832b1c47083b4fcee88971a6139dea510f3e710cb82" exitCode=0 Nov 26 13:35:40 crc kubenswrapper[5200]: I1126 13:35:40.367241 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2dll" event={"ID":"4aa9113c-1a68-4764-ad6f-2c0b804e194b","Type":"ContainerDied","Data":"a3b7213cfb0cff1a6ba06832b1c47083b4fcee88971a6139dea510f3e710cb82"} Nov 26 13:35:40 crc kubenswrapper[5200]: I1126 13:35:40.369987 5200 generic.go:358] "Generic (PLEG): container finished" podID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerID="fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e" exitCode=0 Nov 26 13:35:40 crc kubenswrapper[5200]: I1126 13:35:40.370041 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxr6n" event={"ID":"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c","Type":"ContainerDied","Data":"fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e"} Nov 26 13:35:41 crc kubenswrapper[5200]: I1126 13:35:41.378839 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxr6n" event={"ID":"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c","Type":"ContainerStarted","Data":"6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d"} Nov 26 13:35:41 crc kubenswrapper[5200]: I1126 13:35:41.382368 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2dll" event={"ID":"4aa9113c-1a68-4764-ad6f-2c0b804e194b","Type":"ContainerStarted","Data":"f93f96ba80ff1bfff3d58bab2d8983787dd341abdb4abe0ae9238bad3437ef45"} Nov 26 13:35:41 crc kubenswrapper[5200]: I1126 13:35:41.418786 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j2dll" podStartSLOduration=4.071141449 podStartE2EDuration="6.418772133s" podCreationTimestamp="2025-11-26 13:35:35 +0000 UTC" 
firstStartedPulling="2025-11-26 13:35:36.324523089 +0000 UTC m=+305.003724937" lastFinishedPulling="2025-11-26 13:35:38.672153763 +0000 UTC m=+307.351355621" observedRunningTime="2025-11-26 13:35:41.416450113 +0000 UTC m=+310.095651941" watchObservedRunningTime="2025-11-26 13:35:41.418772133 +0000 UTC m=+310.097973951" Nov 26 13:35:41 crc kubenswrapper[5200]: I1126 13:35:41.418886 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nxr6n" podStartSLOduration=4.115227904 podStartE2EDuration="5.418880666s" podCreationTimestamp="2025-11-26 13:35:36 +0000 UTC" firstStartedPulling="2025-11-26 13:35:38.35193095 +0000 UTC m=+307.031132768" lastFinishedPulling="2025-11-26 13:35:39.655583712 +0000 UTC m=+308.334785530" observedRunningTime="2025-11-26 13:35:41.398243662 +0000 UTC m=+310.077445500" watchObservedRunningTime="2025-11-26 13:35:41.418880666 +0000 UTC m=+310.098082484" Nov 26 13:35:43 crc kubenswrapper[5200]: I1126 13:35:43.412641 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:43 crc kubenswrapper[5200]: I1126 13:35:43.413233 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:43 crc kubenswrapper[5200]: I1126 13:35:43.471046 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:44 crc kubenswrapper[5200]: I1126 13:35:44.471512 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 13:35:44 crc kubenswrapper[5200]: I1126 13:35:44.567363 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:44 crc kubenswrapper[5200]: I1126 13:35:44.567611 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:44 crc kubenswrapper[5200]: I1126 13:35:44.615779 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:45 crc kubenswrapper[5200]: I1126 13:35:45.456611 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4f7tr" Nov 26 13:35:45 crc kubenswrapper[5200]: I1126 13:35:45.603643 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:45 crc kubenswrapper[5200]: I1126 13:35:45.603744 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:45 crc kubenswrapper[5200]: I1126 13:35:45.653599 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:46 crc kubenswrapper[5200]: I1126 13:35:46.479471 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j2dll" Nov 26 13:35:46 crc kubenswrapper[5200]: I1126 13:35:46.954900 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:46 crc kubenswrapper[5200]: I1126 13:35:46.956101 5200 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:47 crc kubenswrapper[5200]: I1126 13:35:47.009879 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:35:47 crc kubenswrapper[5200]: I1126 13:35:47.472801 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 13:37:03 crc kubenswrapper[5200]: I1126 13:37:03.154273 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:37:03 crc kubenswrapper[5200]: I1126 13:37:03.155525 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:37:33 crc kubenswrapper[5200]: I1126 13:37:33.154507 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:37:33 crc kubenswrapper[5200]: I1126 13:37:33.155412 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.154786 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.155488 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.155587 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.156429 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e69b8425297c9144715ca780bb68a537c0ee35471dbc6290f21c3cba97e34571"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.156549 5200 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://e69b8425297c9144715ca780bb68a537c0ee35471dbc6290f21c3cba97e34571" gracePeriod=600 Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.494288 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="e69b8425297c9144715ca780bb68a537c0ee35471dbc6290f21c3cba97e34571" exitCode=0 Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.494369 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"e69b8425297c9144715ca780bb68a537c0ee35471dbc6290f21c3cba97e34571"} Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.494741 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"7865dc73495aed9f0598da0640e33dd9acaf0a07a4d2d3ce95f95611a5909a0f"} Nov 26 13:38:03 crc kubenswrapper[5200]: I1126 13:38:03.494768 5200 scope.go:117] "RemoveContainer" containerID="a44e89797844111bb8aded93f9b56ff9fb33b44e5ce546ddc59c7e9f7e94a788" Nov 26 13:39:01 crc kubenswrapper[5200]: I1126 13:39:01.394638 5200 scope.go:117] "RemoveContainer" containerID="339fde4cb529ce6cb50086584205f864510acc9458e186b3e46f48227b91ff5f" Nov 26 13:39:01 crc kubenswrapper[5200]: I1126 13:39:01.419821 5200 scope.go:117] "RemoveContainer" containerID="7b46d05fd6c07fe6e796d894b0a275bdb9be268d58b2387d970e4e2e556a23af" Nov 26 13:39:01 crc kubenswrapper[5200]: I1126 13:39:01.445930 5200 scope.go:117] "RemoveContainer" containerID="8a82977cd8792981b3d03563fc68949e2dffd3090407b4e7342353f48cf8b1a5" Nov 26 13:40:03 crc kubenswrapper[5200]: I1126 13:40:03.154298 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:40:03 crc kubenswrapper[5200]: I1126 13:40:03.155479 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:40:33 crc kubenswrapper[5200]: I1126 13:40:33.154966 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:40:33 crc kubenswrapper[5200]: I1126 13:40:33.157366 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:40:33 crc kubenswrapper[5200]: I1126 13:40:33.575110 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:40:33 crc kubenswrapper[5200]: I1126 13:40:33.576480 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.154445 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.155398 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.155463 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.156497 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7865dc73495aed9f0598da0640e33dd9acaf0a07a4d2d3ce95f95611a5909a0f"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.156579 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://7865dc73495aed9f0598da0640e33dd9acaf0a07a4d2d3ce95f95611a5909a0f" gracePeriod=600 Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.342141 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.962408 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="7865dc73495aed9f0598da0640e33dd9acaf0a07a4d2d3ce95f95611a5909a0f" exitCode=0 Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.962458 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"7865dc73495aed9f0598da0640e33dd9acaf0a07a4d2d3ce95f95611a5909a0f"} Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.963287 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"4f8422efb2c4a1392904b1f896202fb75031d92c9e97e6bd79cfbb8fc5f04b08"} Nov 26 13:41:03 crc kubenswrapper[5200]: I1126 13:41:03.963322 5200 scope.go:117] "RemoveContainer" containerID="e69b8425297c9144715ca780bb68a537c0ee35471dbc6290f21c3cba97e34571" Nov 26 13:43:03 crc kubenswrapper[5200]: I1126 13:43:03.154789 5200 patch_prober.go:28] interesting 
pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:43:03 crc kubenswrapper[5200]: I1126 13:43:03.155817 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:43:33 crc kubenswrapper[5200]: I1126 13:43:33.154333 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:43:33 crc kubenswrapper[5200]: I1126 13:43:33.155433 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.220600 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp"] Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.221988 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="kube-rbac-proxy" containerID="cri-o://eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.222197 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="ovnkube-cluster-manager" containerID="cri-o://05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.412735 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rzd7"] Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414289 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414293 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="nbdb" containerID="cri-o://4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414413 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="sbdb" 
containerID="cri-o://20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414527 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-acl-logging" containerID="cri-o://d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414585 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-node" containerID="cri-o://2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414631 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-controller" containerID="cri-o://35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.414568 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="northd" containerID="cri-o://64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.449377 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.454334 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovnkube-controller" containerID="cri-o://4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" gracePeriod=30 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.485460 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc"] Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.486120 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="kube-rbac-proxy" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.486143 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="kube-rbac-proxy" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.486160 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="ovnkube-cluster-manager" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.486167 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="ovnkube-cluster-manager" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.486270 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerName="kube-rbac-proxy" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.486286 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" 
containerName="ovnkube-cluster-manager" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.490470 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562134 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovn-control-plane-metrics-cert\") pod \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562223 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8klvc\" (UniqueName: \"kubernetes.io/projected/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-kube-api-access-8klvc\") pod \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562270 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovnkube-config\") pod \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562296 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-env-overrides\") pod \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\" (UID: \"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562466 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562733 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxhl\" (UniqueName: \"kubernetes.io/projected/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-kube-api-access-kqxhl\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.562885 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.563076 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: 
I1126 13:43:38.563831 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" (UID: "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.564418 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" (UID: "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.574203 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-kube-api-access-8klvc" (OuterVolumeSpecName: "kube-api-access-8klvc") pod "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" (UID: "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635"). InnerVolumeSpecName "kube-api-access-8klvc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.574214 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" (UID: "cfe40ad4-ca2b-4189-98c4-7f81c5bfc635"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577241 5200 generic.go:358] "Generic (PLEG): container finished" podID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerID="05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423" exitCode=0 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577346 5200 generic.go:358] "Generic (PLEG): container finished" podID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" containerID="eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08" exitCode=0 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577411 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577324 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" event={"ID":"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635","Type":"ContainerDied","Data":"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577586 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" event={"ID":"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635","Type":"ContainerDied","Data":"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577613 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp" event={"ID":"cfe40ad4-ca2b-4189-98c4-7f81c5bfc635","Type":"ContainerDied","Data":"38aee41b2193e192d892c686e30d2a83e5a97ec448afd4bfa866bcabc4ae29ee"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.577649 5200 scope.go:117] "RemoveContainer" containerID="05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.585239 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.585304 5200 generic.go:358] "Generic (PLEG): container finished" podID="1f6a6875-f90c-438b-83a3-0fccba5ff947" containerID="9ea63010c351847083a3776fd49e496c4cadfc2912cb79add611f3746bb1de86" exitCode=2 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.585406 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc24w" event={"ID":"1f6a6875-f90c-438b-83a3-0fccba5ff947","Type":"ContainerDied","Data":"9ea63010c351847083a3776fd49e496c4cadfc2912cb79add611f3746bb1de86"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.588193 5200 scope.go:117] "RemoveContainer" containerID="9ea63010c351847083a3776fd49e496c4cadfc2912cb79add611f3746bb1de86" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.592638 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rzd7_a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/ovn-acl-logging/0.log" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.593222 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rzd7_a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/ovn-controller/0.log" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594051 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" exitCode=0 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594074 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" exitCode=0 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594084 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" exitCode=143 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594095 5200 generic.go:358] 
"Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" exitCode=143 Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594145 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594186 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594201 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.594214 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d"} Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.606509 5200 scope.go:117] "RemoveContainer" containerID="eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.626991 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp"] Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.631669 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-j5jsp"] Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.662297 5200 scope.go:117] "RemoveContainer" containerID="05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423" Nov 26 13:43:38 crc kubenswrapper[5200]: E1126 13:43:38.662848 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423\": container with ID starting with 05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423 not found: ID does not exist" containerID="05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.662892 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423"} err="failed to get container status \"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423\": rpc error: code = NotFound desc = could not find container \"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423\": container with ID starting with 05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423 not found: ID does not exist" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.662916 5200 scope.go:117] "RemoveContainer" containerID="eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08" Nov 26 13:43:38 crc kubenswrapper[5200]: E1126 13:43:38.663210 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08\": container with ID starting with eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08 not found: ID does not exist" containerID="eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.663236 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08"} err="failed to get container status \"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08\": rpc error: code = NotFound desc = could not find container \"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08\": container with ID starting with eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08 not found: ID does not exist" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.663254 5200 scope.go:117] "RemoveContainer" containerID="05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.663876 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423"} err="failed to get container status \"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423\": rpc error: code = NotFound desc = could not find container \"05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423\": container with ID starting with 05d7453a4213f5daf32044a53d14393bfb959ed7d4d14fb546a245ec8dd12423 not found: ID does not exist" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.663938 5200 scope.go:117] "RemoveContainer" containerID="eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664286 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664390 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxhl\" (UniqueName: \"kubernetes.io/projected/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-kube-api-access-kqxhl\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664427 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664473 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664516 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664530 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8klvc\" (UniqueName: \"kubernetes.io/projected/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-kube-api-access-8klvc\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664542 5200 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664541 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08"} err="failed to get container status \"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08\": rpc error: code = NotFound desc = could not find container \"eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08\": container with ID starting with eb229b8a91570a92239f962521cf5c828cf4c5ffaf347672073786973a88fa08 not found: ID does not exist" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.664551 5200 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.665281 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.666042 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.671820 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.682883 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxhl\" (UniqueName: \"kubernetes.io/projected/bf6a84fc-d32d-41ee-a924-3cf23ee0c79a-kube-api-access-kqxhl\") pod \"ovnkube-control-plane-97c9b6c48-mhfkc\" (UID: \"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.783940 5200 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rzd7_a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/ovn-acl-logging/0.log" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.784714 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rzd7_a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/ovn-controller/0.log" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.785569 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.853427 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.858135 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lvd56"] Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859407 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859453 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859487 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-controller" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859506 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-controller" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859552 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kubecfg-setup" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859568 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kubecfg-setup" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859607 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="nbdb" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859624 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="nbdb" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859648 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-acl-logging" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.859665 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-acl-logging" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860543 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-node" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860572 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-node" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860593 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="sbdb" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860608 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="sbdb" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860629 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovnkube-controller" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860644 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovnkube-controller" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860665 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="northd" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860681 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="northd" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860958 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="sbdb" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.860987 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-node" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.861006 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovnkube-controller" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.861033 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-controller" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.861055 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="northd" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.861076 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="nbdb" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.861099 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="ovn-acl-logging" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.861116 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867172 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88zw\" (UniqueName: \"kubernetes.io/projected/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-kube-api-access-n88zw\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867239 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-env-overrides\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867262 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-netns\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867282 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-systemd-units\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867361 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867405 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867502 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-var-lib-openvswitch\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867612 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-config\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867675 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867742 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867766 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-ovn-kubernetes\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867819 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-bin\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867950 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-etc-openvswitch\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.867989 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-openvswitch\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868039 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-log-socket\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868116 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovn-node-metrics-cert\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868170 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-ovn\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868267 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-node-log\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868362 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-slash\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868447 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: 
"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868446 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-systemd\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868493 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-netd\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868513 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-kubelet\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868563 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-script-lib\") pod \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\" (UID: \"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a\") " Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868553 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868667 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868947 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868973 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.868994 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869034 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869081 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869093 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869116 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-log-socket" (OuterVolumeSpecName: "log-socket") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869182 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869222 5200 reconciler_common.go:299] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869229 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-node-log" (OuterVolumeSpecName: "node-log") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869258 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869237 5200 reconciler_common.go:299] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869281 5200 reconciler_common.go:299] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869289 5200 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869287 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-slash" (OuterVolumeSpecName: "host-slash") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869298 5200 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869356 5200 reconciler_common.go:299] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869378 5200 reconciler_common.go:299] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869397 5200 reconciler_common.go:299] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869440 5200 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869460 5200 reconciler_common.go:299] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869481 5200 reconciler_common.go:299] "Volume detached for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869503 5200 reconciler_common.go:299] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.869521 5200 reconciler_common.go:299] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-log-socket\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.873980 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-kube-api-access-n88zw" (OuterVolumeSpecName: "kube-api-access-n88zw") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "kube-api-access-n88zw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.876649 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.877955 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.884609 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" (UID: "a1ef9b40-5d21-4743-a050-2b1fa0d95b7a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:38 crc kubenswrapper[5200]: W1126 13:43:38.891647 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf6a84fc_d32d_41ee_a924_3cf23ee0c79a.slice/crio-220dd91f146f176eec9f94b4e33f0ee6297f4f718dc2c7c1093580684166d5bb WatchSource:0}: Error finding container 220dd91f146f176eec9f94b4e33f0ee6297f4f718dc2c7c1093580684166d5bb: Status 404 returned error can't find the container with id 220dd91f146f176eec9f94b4e33f0ee6297f4f718dc2c7c1093580684166d5bb Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970182 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovn-node-metrics-cert\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970239 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970260 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-env-overrides\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970277 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-kubelet\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970304 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-cni-netd\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970324 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-cni-bin\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970343 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovnkube-script-lib\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970363 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-run-netns\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970536 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-slash\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970582 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8fz\" (UniqueName: \"kubernetes.io/projected/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-kube-api-access-9v8fz\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970608 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970658 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-node-log\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970713 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-etc-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970778 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-systemd\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970805 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-var-lib-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970830 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovnkube-config\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 
26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970852 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-systemd-units\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970879 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.970932 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-log-socket\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971013 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-ovn\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971128 5200 reconciler_common.go:299] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-node-log\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971145 5200 reconciler_common.go:299] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-slash\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971158 5200 reconciler_common.go:299] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971171 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n88zw\" (UniqueName: \"kubernetes.io/projected/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-kube-api-access-n88zw\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971185 5200 reconciler_common.go:299] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971199 5200 reconciler_common.go:299] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: I1126 13:43:38.971271 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:38 crc kubenswrapper[5200]: 
I1126 13:43:38.984737 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe40ad4-ca2b-4189-98c4-7f81c5bfc635" path="/var/lib/kubelet/pods/cfe40ad4-ca2b-4189-98c4-7f81c5bfc635/volumes" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073107 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-cni-netd\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073164 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-cni-bin\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073189 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovnkube-script-lib\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073214 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-run-netns\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073237 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-slash\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073270 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8fz\" (UniqueName: \"kubernetes.io/projected/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-kube-api-access-9v8fz\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073290 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073307 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-node-log\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073359 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-etc-openvswitch\") 
pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073368 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-slash\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073416 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-systemd\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073372 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-cni-bin\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073379 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-systemd\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073469 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-var-lib-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073468 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073487 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovnkube-config\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073586 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-systemd-units\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073637 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073665 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-log-socket\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073718 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-ovn\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073776 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovn-node-metrics-cert\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073873 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073906 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-env-overrides\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.073945 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-kubelet\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074113 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-kubelet\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074154 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-systemd-units\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074178 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: 
I1126 13:43:39.074190 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovnkube-config\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074203 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-log-socket\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074239 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-ovn\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074303 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovnkube-script-lib\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074384 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-etc-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074458 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-run-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.074834 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-var-lib-openvswitch\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.075217 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-node-log\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.075268 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-run-netns\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.075339 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-env-overrides\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.075392 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-host-cni-netd\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.079183 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-ovn-node-metrics-cert\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.090958 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8fz\" (UniqueName: \"kubernetes.io/projected/d6c5ac16-3014-4d82-8ff7-acb49532b0d0-kube-api-access-9v8fz\") pod \"ovnkube-node-lvd56\" (UID: \"d6c5ac16-3014-4d82-8ff7-acb49532b0d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.214782 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.605135 5200 generic.go:358] "Generic (PLEG): container finished" podID="d6c5ac16-3014-4d82-8ff7-acb49532b0d0" containerID="b9f50c20b0b8e681c6e1c1757e77a82ada64a8f79845fd3b931958b503c90298" exitCode=0 Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.605300 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerDied","Data":"b9f50c20b0b8e681c6e1c1757e77a82ada64a8f79845fd3b931958b503c90298"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.605425 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"b2787cac5e180a162e4201aebcf69c280a63f744ee68b624964a09eadf58588b"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.608957 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" event={"ID":"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a","Type":"ContainerStarted","Data":"7030e094a1a25952e31639f3ed65ad6e3622713d295d8ea30b8c0ff35aaff7a0"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.609005 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" event={"ID":"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a","Type":"ContainerStarted","Data":"03ce70d4858460976d0f376d882300f95f7e768696883cbd8dcd8b8af2e94f5b"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.609027 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" event={"ID":"bf6a84fc-d32d-41ee-a924-3cf23ee0c79a","Type":"ContainerStarted","Data":"220dd91f146f176eec9f94b4e33f0ee6297f4f718dc2c7c1093580684166d5bb"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.617388 5200 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.617571 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc24w" event={"ID":"1f6a6875-f90c-438b-83a3-0fccba5ff947","Type":"ContainerStarted","Data":"59e303d920e0d4bcdc05298fea1f3bc6ac41f545ebbc434e353768a8a11b6f62"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.622564 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rzd7_a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/ovn-acl-logging/0.log" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.622978 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5rzd7_a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/ovn-controller/0.log" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623286 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" exitCode=0 Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623307 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" exitCode=0 Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623314 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" exitCode=0 Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623321 5200 generic.go:358] "Generic (PLEG): container finished" podID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" exitCode=0 Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623363 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623393 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623403 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623413 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623423 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" event={"ID":"a1ef9b40-5d21-4743-a050-2b1fa0d95b7a","Type":"ContainerDied","Data":"02d3a2db83c2a4dd09e74572f0d6e829dd4fc10f4e8dc1da71dea195f6a6bdb8"} Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623442 
5200 scope.go:117] "RemoveContainer" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.623733 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5rzd7" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.647853 5200 scope.go:117] "RemoveContainer" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.673782 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-mhfkc" podStartSLOduration=1.673759303 podStartE2EDuration="1.673759303s" podCreationTimestamp="2025-11-26 13:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:43:39.668429109 +0000 UTC m=+788.347630937" watchObservedRunningTime="2025-11-26 13:43:39.673759303 +0000 UTC m=+788.352961121" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.694381 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rzd7"] Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.696022 5200 scope.go:117] "RemoveContainer" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.699567 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5rzd7"] Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.718505 5200 scope.go:117] "RemoveContainer" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.736077 5200 scope.go:117] "RemoveContainer" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.751530 5200 scope.go:117] "RemoveContainer" containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.768205 5200 scope.go:117] "RemoveContainer" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.792505 5200 scope.go:117] "RemoveContainer" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.813486 5200 scope.go:117] "RemoveContainer" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.853375 5200 scope.go:117] "RemoveContainer" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.854017 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": container with ID starting with 4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec not found: ID does not exist" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.854057 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec"} 
err="failed to get container status \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": rpc error: code = NotFound desc = could not find container \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": container with ID starting with 4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.854084 5200 scope.go:117] "RemoveContainer" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.854538 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": container with ID starting with 20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f not found: ID does not exist" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.854560 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f"} err="failed to get container status \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": rpc error: code = NotFound desc = could not find container \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": container with ID starting with 20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.854576 5200 scope.go:117] "RemoveContainer" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.855102 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": container with ID starting with 4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516 not found: ID does not exist" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.855133 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516"} err="failed to get container status \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": rpc error: code = NotFound desc = could not find container \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": container with ID starting with 4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.855148 5200 scope.go:117] "RemoveContainer" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.855424 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": container with ID starting with 64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63 not found: ID does not exist" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.855453 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63"} err="failed to get container status \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": rpc error: code = NotFound desc = could not find container \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": container with ID starting with 64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.855468 5200 scope.go:117] "RemoveContainer" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.855792 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": container with ID starting with 8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553 not found: ID does not exist" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.855815 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553"} err="failed to get container status \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": rpc error: code = NotFound desc = could not find container \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": container with ID starting with 8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.855828 5200 scope.go:117] "RemoveContainer" containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.856265 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": container with ID starting with 2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda not found: ID does not exist" containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.856290 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda"} err="failed to get container status \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": rpc error: code = NotFound desc = could not find container \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": container with ID starting with 2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.856304 5200 scope.go:117] "RemoveContainer" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.856775 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": container with ID starting with d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208 not found: ID does 
not exist" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.856846 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208"} err="failed to get container status \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": rpc error: code = NotFound desc = could not find container \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": container with ID starting with d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.856891 5200 scope.go:117] "RemoveContainer" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.857509 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": container with ID starting with 35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d not found: ID does not exist" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.857543 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d"} err="failed to get container status \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": rpc error: code = NotFound desc = could not find container \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": container with ID starting with 35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.857561 5200 scope.go:117] "RemoveContainer" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" Nov 26 13:43:39 crc kubenswrapper[5200]: E1126 13:43:39.857916 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": container with ID starting with b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261 not found: ID does not exist" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.857941 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261"} err="failed to get container status \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": rpc error: code = NotFound desc = could not find container \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": container with ID starting with b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.857955 5200 scope.go:117] "RemoveContainer" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.858270 5200 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec"} err="failed to get container status \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": rpc error: code = NotFound desc = could not find container \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": container with ID starting with 4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.858302 5200 scope.go:117] "RemoveContainer" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.858667 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f"} err="failed to get container status \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": rpc error: code = NotFound desc = could not find container \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": container with ID starting with 20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.858752 5200 scope.go:117] "RemoveContainer" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.859179 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516"} err="failed to get container status \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": rpc error: code = NotFound desc = could not find container \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": container with ID starting with 4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.859203 5200 scope.go:117] "RemoveContainer" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.859565 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63"} err="failed to get container status \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": rpc error: code = NotFound desc = could not find container \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": container with ID starting with 64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.859585 5200 scope.go:117] "RemoveContainer" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.859870 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553"} err="failed to get container status \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": rpc error: code = NotFound desc = could not find container \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": container with ID starting with 8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553 not found: ID does not exist" Nov 
26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.859890 5200 scope.go:117] "RemoveContainer" containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.860158 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda"} err="failed to get container status \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": rpc error: code = NotFound desc = could not find container \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": container with ID starting with 2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.860178 5200 scope.go:117] "RemoveContainer" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.860567 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208"} err="failed to get container status \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": rpc error: code = NotFound desc = could not find container \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": container with ID starting with d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.860621 5200 scope.go:117] "RemoveContainer" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.861156 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d"} err="failed to get container status \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": rpc error: code = NotFound desc = could not find container \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": container with ID starting with 35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.861186 5200 scope.go:117] "RemoveContainer" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.861560 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261"} err="failed to get container status \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": rpc error: code = NotFound desc = could not find container \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": container with ID starting with b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.861592 5200 scope.go:117] "RemoveContainer" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.862147 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec"} err="failed to get container status 
\"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": rpc error: code = NotFound desc = could not find container \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": container with ID starting with 4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.862171 5200 scope.go:117] "RemoveContainer" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.862500 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f"} err="failed to get container status \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": rpc error: code = NotFound desc = could not find container \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": container with ID starting with 20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.862527 5200 scope.go:117] "RemoveContainer" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.862970 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516"} err="failed to get container status \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": rpc error: code = NotFound desc = could not find container \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": container with ID starting with 4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.862993 5200 scope.go:117] "RemoveContainer" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.863291 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63"} err="failed to get container status \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": rpc error: code = NotFound desc = could not find container \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": container with ID starting with 64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.863316 5200 scope.go:117] "RemoveContainer" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.863580 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553"} err="failed to get container status \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": rpc error: code = NotFound desc = could not find container \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": container with ID starting with 8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.863612 5200 scope.go:117] "RemoveContainer" 
containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.863904 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda"} err="failed to get container status \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": rpc error: code = NotFound desc = could not find container \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": container with ID starting with 2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.863927 5200 scope.go:117] "RemoveContainer" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864194 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208"} err="failed to get container status \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": rpc error: code = NotFound desc = could not find container \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": container with ID starting with d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864222 5200 scope.go:117] "RemoveContainer" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864452 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d"} err="failed to get container status \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": rpc error: code = NotFound desc = could not find container \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": container with ID starting with 35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864481 5200 scope.go:117] "RemoveContainer" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864723 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261"} err="failed to get container status \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": rpc error: code = NotFound desc = could not find container \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": container with ID starting with b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864745 5200 scope.go:117] "RemoveContainer" containerID="4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864966 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec"} err="failed to get container status \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": rpc error: code = NotFound desc = could not find 
container \"4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec\": container with ID starting with 4feb85039cf9d68d874cb71d4aeb2e57cefc262edd48270855b9759be1afc2ec not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.864985 5200 scope.go:117] "RemoveContainer" containerID="20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865148 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f"} err="failed to get container status \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": rpc error: code = NotFound desc = could not find container \"20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f\": container with ID starting with 20e03bde355a5c097307030f0b0e3656c329f6ee76d3dba4b30d10e4cf0a881f not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865167 5200 scope.go:117] "RemoveContainer" containerID="4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865347 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516"} err="failed to get container status \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": rpc error: code = NotFound desc = could not find container \"4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516\": container with ID starting with 4bc778bf8f90fdbf47b66b514fd93ca574095e113b94b466de8d90a96268d516 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865367 5200 scope.go:117] "RemoveContainer" containerID="64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865586 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63"} err="failed to get container status \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": rpc error: code = NotFound desc = could not find container \"64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63\": container with ID starting with 64cf43cc9dc90872bfc6a8ae32ae1f9071d894e7e2e3614fa6ef31f80b1aec63 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865613 5200 scope.go:117] "RemoveContainer" containerID="8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865832 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553"} err="failed to get container status \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": rpc error: code = NotFound desc = could not find container \"8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553\": container with ID starting with 8b27b792a9327aec79d12bbaf7c0702677f8d5955a3673cde3ce4fa4c10e6553 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.865853 5200 scope.go:117] "RemoveContainer" containerID="2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866045 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda"} err="failed to get container status \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": rpc error: code = NotFound desc = could not find container \"2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda\": container with ID starting with 2f0cc3fde2545adabf7ed8f44b83c2d15bc953d32e3d1e2c557a54c50c878dda not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866068 5200 scope.go:117] "RemoveContainer" containerID="d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866385 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208"} err="failed to get container status \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": rpc error: code = NotFound desc = could not find container \"d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208\": container with ID starting with d2e4e3ef42d7bfd5a341a61774ff13a84005cc7c088d4da04ad4f58554a5d208 not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866409 5200 scope.go:117] "RemoveContainer" containerID="35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866671 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d"} err="failed to get container status \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": rpc error: code = NotFound desc = could not find container \"35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d\": container with ID starting with 35848a109e3f953012738d70a9dddb61026083067d48534ada50e016eea2ae9d not found: ID does not exist" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866715 5200 scope.go:117] "RemoveContainer" containerID="b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261" Nov 26 13:43:39 crc kubenswrapper[5200]: I1126 13:43:39.866962 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261"} err="failed to get container status \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": rpc error: code = NotFound desc = could not find container \"b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261\": container with ID starting with b7c7f6706e680333a281e7fa1f17ac3396bb4e9954af6e1bcfc1bce601833261 not found: ID does not exist" Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.635146 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"633d587436c9cb9f1463a55b9a995a684f0287b8a30fae111a892cc38ce5e619"} Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.635748 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"2a1cb01e9e32bbf26a91ea594e79ceb64c613dbe45d59c33db0c981608c4708e"} Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.635768 5200 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"0eceaaafc93e132f95802111945eff09bb500ec44cac62bbc995f76ae7101c0d"} Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.635791 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"7fef8668851ae53e044277b527cbfa3030ff43b77aae267ce4716bd8f86cd684"} Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.635805 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"8008386e4ac86aa93c2713c28a2ba5819c546c53bf1b479d25de1bf13e21eed9"} Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.635820 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"bac64186e7d5638ffd74d7ae594412254d5e8d11cfcb4e23888ee2ff7d6b85e0"} Nov 26 13:43:40 crc kubenswrapper[5200]: I1126 13:43:40.982194 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ef9b40-5d21-4743-a050-2b1fa0d95b7a" path="/var/lib/kubelet/pods/a1ef9b40-5d21-4743-a050-2b1fa0d95b7a/volumes" Nov 26 13:43:43 crc kubenswrapper[5200]: I1126 13:43:43.671401 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"c354b51d34401bdb79b0c6c56114e19ff2d5ec98e4caabab8eab5e2ac4b2e1d7"} Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.691833 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" event={"ID":"d6c5ac16-3014-4d82-8ff7-acb49532b0d0","Type":"ContainerStarted","Data":"ba59bc0f96f61f8acc0b2df788058aa0998b7d0a5e21cbbc3e7b175906483be4"} Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.693015 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.693045 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.693059 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.737910 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" podStartSLOduration=7.737891268 podStartE2EDuration="7.737891268s" podCreationTimestamp="2025-11-26 13:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:43:45.73355696 +0000 UTC m=+794.412758788" watchObservedRunningTime="2025-11-26 13:43:45.737891268 +0000 UTC m=+794.417093086" Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.742025 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:45 crc kubenswrapper[5200]: I1126 13:43:45.742135 5200 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.834250 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ppxwn"] Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.952301 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ppxwn"] Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.952544 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.956127 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"crc-storage\"" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.956329 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"crc-storage\"/\"crc-storage-dockercfg-c4cvj\"" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.956395 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"openshift-service-ca.crt\"" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.956395 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"kube-root-ca.crt\"" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.989629 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdp7r\" (UniqueName: \"kubernetes.io/projected/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-kube-api-access-gdp7r\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.989745 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-node-mnt\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:50 crc kubenswrapper[5200]: I1126 13:43:50.989842 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-crc-storage\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.091982 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdp7r\" (UniqueName: \"kubernetes.io/projected/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-kube-api-access-gdp7r\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.092079 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-node-mnt\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.092150 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-crc-storage\") pod \"crc-storage-crc-ppxwn\" (UID: 
\"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.093299 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-node-mnt\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.093797 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-crc-storage\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.121400 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdp7r\" (UniqueName: \"kubernetes.io/projected/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-kube-api-access-gdp7r\") pod \"crc-storage-crc-ppxwn\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.280428 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.567717 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ppxwn"] Nov 26 13:43:51 crc kubenswrapper[5200]: I1126 13:43:51.738169 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ppxwn" event={"ID":"f17fa9cf-7538-4315-a9c8-5542da7f1eaf","Type":"ContainerStarted","Data":"b4fa86219dbc3039bfd39518f08295b7aa08479842e5856688d2e681ea7ae4fe"} Nov 26 13:43:53 crc kubenswrapper[5200]: I1126 13:43:53.756723 5200 generic.go:358] "Generic (PLEG): container finished" podID="f17fa9cf-7538-4315-a9c8-5542da7f1eaf" containerID="a7b87f4f526bf4eed31c217753c0fbdb7761d4e01bcfff502767efb319ac15ff" exitCode=0 Nov 26 13:43:53 crc kubenswrapper[5200]: I1126 13:43:53.756979 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ppxwn" event={"ID":"f17fa9cf-7538-4315-a9c8-5542da7f1eaf","Type":"ContainerDied","Data":"a7b87f4f526bf4eed31c217753c0fbdb7761d4e01bcfff502767efb319ac15ff"} Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.035739 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.157479 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdp7r\" (UniqueName: \"kubernetes.io/projected/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-kube-api-access-gdp7r\") pod \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.157742 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-crc-storage\") pod \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.157808 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-node-mnt\") pod \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\" (UID: \"f17fa9cf-7538-4315-a9c8-5542da7f1eaf\") " Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.158212 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f17fa9cf-7538-4315-a9c8-5542da7f1eaf" (UID: "f17fa9cf-7538-4315-a9c8-5542da7f1eaf"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.165313 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-kube-api-access-gdp7r" (OuterVolumeSpecName: "kube-api-access-gdp7r") pod "f17fa9cf-7538-4315-a9c8-5542da7f1eaf" (UID: "f17fa9cf-7538-4315-a9c8-5542da7f1eaf"). InnerVolumeSpecName "kube-api-access-gdp7r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.181963 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f17fa9cf-7538-4315-a9c8-5542da7f1eaf" (UID: "f17fa9cf-7538-4315-a9c8-5542da7f1eaf"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.259259 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdp7r\" (UniqueName: \"kubernetes.io/projected/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-kube-api-access-gdp7r\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.259296 5200 reconciler_common.go:299] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.259307 5200 reconciler_common.go:299] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f17fa9cf-7538-4315-a9c8-5542da7f1eaf-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.778399 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ppxwn" event={"ID":"f17fa9cf-7538-4315-a9c8-5542da7f1eaf","Type":"ContainerDied","Data":"b4fa86219dbc3039bfd39518f08295b7aa08479842e5856688d2e681ea7ae4fe"} Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.779041 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4fa86219dbc3039bfd39518f08295b7aa08479842e5856688d2e681ea7ae4fe" Nov 26 13:43:55 crc kubenswrapper[5200]: I1126 13:43:55.778466 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ppxwn" Nov 26 13:44:02 crc kubenswrapper[5200]: I1126 13:44:02.653884 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-jdphn"] Nov 26 13:44:02 crc kubenswrapper[5200]: I1126 13:44:02.655212 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f17fa9cf-7538-4315-a9c8-5542da7f1eaf" containerName="storage" Nov 26 13:44:02 crc kubenswrapper[5200]: I1126 13:44:02.655233 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17fa9cf-7538-4315-a9c8-5542da7f1eaf" containerName="storage" Nov 26 13:44:02 crc kubenswrapper[5200]: I1126 13:44:02.655363 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f17fa9cf-7538-4315-a9c8-5542da7f1eaf" containerName="storage" Nov 26 13:44:02 crc kubenswrapper[5200]: I1126 13:44:02.922783 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-jdphn"] Nov 26 13:44:02 crc kubenswrapper[5200]: I1126 13:44:02.922959 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.098611 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-registry-certificates\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.098711 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.098895 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-trusted-ca\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.098974 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-registry-tls\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.099049 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.099166 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.099342 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-bound-sa-token\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.099523 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hdhx\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-kube-api-access-4hdhx\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " 
pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.131934 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.154676 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.154845 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.154916 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.155763 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f8422efb2c4a1392904b1f896202fb75031d92c9e97e6bd79cfbb8fc5f04b08"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.155853 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://4f8422efb2c4a1392904b1f896202fb75031d92c9e97e6bd79cfbb8fc5f04b08" gracePeriod=600 Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201391 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201487 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201548 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-bound-sa-token\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 
13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201584 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hdhx\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-kube-api-access-4hdhx\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201615 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-registry-certificates\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201672 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-trusted-ca\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.201736 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-registry-tls\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.202626 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.203589 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-registry-certificates\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.203795 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-trusted-ca\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.210618 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.210671 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-registry-tls\") pod 
\"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.222371 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-bound-sa-token\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.227079 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hdhx\" (UniqueName: \"kubernetes.io/projected/8fb0768d-a3a4-4f1b-b88a-e388a4ee9223-kube-api-access-4hdhx\") pod \"image-registry-5d9d95bf5b-jdphn\" (UID: \"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.241771 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.464640 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-jdphn"] Nov 26 13:44:03 crc kubenswrapper[5200]: W1126 13:44:03.470791 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb0768d_a3a4_4f1b_b88a_e388a4ee9223.slice/crio-044bdfd8edc92caf19232329cd881df4d1c8e108e14a7584fd53c122ccc4ae8a WatchSource:0}: Error finding container 044bdfd8edc92caf19232329cd881df4d1c8e108e14a7584fd53c122ccc4ae8a: Status 404 returned error can't find the container with id 044bdfd8edc92caf19232329cd881df4d1c8e108e14a7584fd53c122ccc4ae8a Nov 26 13:44:03 crc kubenswrapper[5200]: I1126 13:44:03.840658 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" event={"ID":"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223","Type":"ContainerStarted","Data":"044bdfd8edc92caf19232329cd881df4d1c8e108e14a7584fd53c122ccc4ae8a"} Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.232607 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd"] Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.442994 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd"] Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.443257 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.446510 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.628281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-util\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.628395 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-bundle\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.628560 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hms9p\" (UniqueName: \"kubernetes.io/projected/fc65da8b-cbfb-4cc8-8a63-a749d7306185-kube-api-access-hms9p\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.731068 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-util\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.731226 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-bundle\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.731350 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hms9p\" (UniqueName: \"kubernetes.io/projected/fc65da8b-cbfb-4cc8-8a63-a749d7306185-kube-api-access-hms9p\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.733427 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-bundle\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " 
pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.733411 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-util\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.774226 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hms9p\" (UniqueName: \"kubernetes.io/projected/fc65da8b-cbfb-4cc8-8a63-a749d7306185-kube-api-access-hms9p\") pod \"7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.855532 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" event={"ID":"8fb0768d-a3a4-4f1b-b88a-e388a4ee9223","Type":"ContainerStarted","Data":"100a29c1511941bb1df8798737aaedc0f942c21e54d44a677249041a9a68509b"} Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.855974 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.862725 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="4f8422efb2c4a1392904b1f896202fb75031d92c9e97e6bd79cfbb8fc5f04b08" exitCode=0 Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.862768 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"4f8422efb2c4a1392904b1f896202fb75031d92c9e97e6bd79cfbb8fc5f04b08"} Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.862939 5200 scope.go:117] "RemoveContainer" containerID="7865dc73495aed9f0598da0640e33dd9acaf0a07a4d2d3ce95f95611a5909a0f" Nov 26 13:44:04 crc kubenswrapper[5200]: I1126 13:44:04.888545 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" podStartSLOduration=2.888512263 podStartE2EDuration="2.888512263s" podCreationTimestamp="2025-11-26 13:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:44:04.888203825 +0000 UTC m=+813.567405683" watchObservedRunningTime="2025-11-26 13:44:04.888512263 +0000 UTC m=+813.567714121" Nov 26 13:44:05 crc kubenswrapper[5200]: I1126 13:44:05.071978 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:05 crc kubenswrapper[5200]: I1126 13:44:05.397484 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd"] Nov 26 13:44:05 crc kubenswrapper[5200]: W1126 13:44:05.409577 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc65da8b_cbfb_4cc8_8a63_a749d7306185.slice/crio-8f7d227a18a1d5cc02b28ecadd256173bdbad08674f008ee0b006bf9db6705f1 WatchSource:0}: Error finding container 8f7d227a18a1d5cc02b28ecadd256173bdbad08674f008ee0b006bf9db6705f1: Status 404 returned error can't find the container with id 8f7d227a18a1d5cc02b28ecadd256173bdbad08674f008ee0b006bf9db6705f1 Nov 26 13:44:05 crc kubenswrapper[5200]: I1126 13:44:05.875743 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" event={"ID":"fc65da8b-cbfb-4cc8-8a63-a749d7306185","Type":"ContainerStarted","Data":"8f7d227a18a1d5cc02b28ecadd256173bdbad08674f008ee0b006bf9db6705f1"} Nov 26 13:44:05 crc kubenswrapper[5200]: I1126 13:44:05.879982 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"499ed0801b3e77071f6b5d1fa3f9f5b7ce1e53c1e78e3fecf97c942228f0d688"} Nov 26 13:44:06 crc kubenswrapper[5200]: I1126 13:44:06.568222 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b75mp"] Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.854176 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b75mp"] Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.854607 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.884327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzkl\" (UniqueName: \"kubernetes.io/projected/4659f491-d71a-4c0f-ab82-9119d4facb07-kube-api-access-mfzkl\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.884811 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-utilities\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.885038 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-catalog-content\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.987187 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzkl\" (UniqueName: \"kubernetes.io/projected/4659f491-d71a-4c0f-ab82-9119d4facb07-kube-api-access-mfzkl\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.987272 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-utilities\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.987330 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-catalog-content\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.988128 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-catalog-content\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:07 crc kubenswrapper[5200]: I1126 13:44:07.988234 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-utilities\") pod \"redhat-operators-b75mp\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.011865 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzkl\" (UniqueName: \"kubernetes.io/projected/4659f491-d71a-4c0f-ab82-9119d4facb07-kube-api-access-mfzkl\") pod \"redhat-operators-b75mp\" (UID: 
\"4659f491-d71a-4c0f-ab82-9119d4facb07\") " pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.172811 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.449944 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b75mp"] Nov 26 13:44:08 crc kubenswrapper[5200]: W1126 13:44:08.456982 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4659f491_d71a_4c0f_ab82_9119d4facb07.slice/crio-62458a6aabfa96c39da58e9dea762016f4087490ef9c192f3602e8123c5ac6d0 WatchSource:0}: Error finding container 62458a6aabfa96c39da58e9dea762016f4087490ef9c192f3602e8123c5ac6d0: Status 404 returned error can't find the container with id 62458a6aabfa96c39da58e9dea762016f4087490ef9c192f3602e8123c5ac6d0 Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.907222 5200 generic.go:358] "Generic (PLEG): container finished" podID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerID="95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc" exitCode=0 Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.907548 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b75mp" event={"ID":"4659f491-d71a-4c0f-ab82-9119d4facb07","Type":"ContainerDied","Data":"95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc"} Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.908080 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b75mp" event={"ID":"4659f491-d71a-4c0f-ab82-9119d4facb07","Type":"ContainerStarted","Data":"62458a6aabfa96c39da58e9dea762016f4087490ef9c192f3602e8123c5ac6d0"} Nov 26 13:44:08 crc kubenswrapper[5200]: I1126 13:44:08.916936 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" event={"ID":"fc65da8b-cbfb-4cc8-8a63-a749d7306185","Type":"ContainerStarted","Data":"7ee7f466d7ff0dde2a35a8dc2af9ae2e707d4f7765aad82029835872f7ae89ea"} Nov 26 13:44:09 crc kubenswrapper[5200]: I1126 13:44:09.928082 5200 generic.go:358] "Generic (PLEG): container finished" podID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerID="7ee7f466d7ff0dde2a35a8dc2af9ae2e707d4f7765aad82029835872f7ae89ea" exitCode=0 Nov 26 13:44:09 crc kubenswrapper[5200]: I1126 13:44:09.928214 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" event={"ID":"fc65da8b-cbfb-4cc8-8a63-a749d7306185","Type":"ContainerDied","Data":"7ee7f466d7ff0dde2a35a8dc2af9ae2e707d4f7765aad82029835872f7ae89ea"} Nov 26 13:44:10 crc kubenswrapper[5200]: I1126 13:44:10.943199 5200 generic.go:358] "Generic (PLEG): container finished" podID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerID="cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2" exitCode=0 Nov 26 13:44:10 crc kubenswrapper[5200]: I1126 13:44:10.943293 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b75mp" event={"ID":"4659f491-d71a-4c0f-ab82-9119d4facb07","Type":"ContainerDied","Data":"cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2"} Nov 26 13:44:11 crc kubenswrapper[5200]: I1126 13:44:11.957261 5200 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-b75mp" event={"ID":"4659f491-d71a-4c0f-ab82-9119d4facb07","Type":"ContainerStarted","Data":"66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c"} Nov 26 13:44:11 crc kubenswrapper[5200]: I1126 13:44:11.989758 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b75mp" podStartSLOduration=4.72968548 podStartE2EDuration="5.989733432s" podCreationTimestamp="2025-11-26 13:44:06 +0000 UTC" firstStartedPulling="2025-11-26 13:44:08.909150042 +0000 UTC m=+817.588351900" lastFinishedPulling="2025-11-26 13:44:10.169197994 +0000 UTC m=+818.848399852" observedRunningTime="2025-11-26 13:44:11.983121212 +0000 UTC m=+820.662323090" watchObservedRunningTime="2025-11-26 13:44:11.989733432 +0000 UTC m=+820.668935260" Nov 26 13:44:12 crc kubenswrapper[5200]: I1126 13:44:12.970779 5200 generic.go:358] "Generic (PLEG): container finished" podID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerID="00696994c543535afc85bbb8fdcba6d435b2604f653c5104ead94c54de592b2c" exitCode=0 Nov 26 13:44:12 crc kubenswrapper[5200]: I1126 13:44:12.984328 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" event={"ID":"fc65da8b-cbfb-4cc8-8a63-a749d7306185","Type":"ContainerDied","Data":"00696994c543535afc85bbb8fdcba6d435b2604f653c5104ead94c54de592b2c"} Nov 26 13:44:13 crc kubenswrapper[5200]: I1126 13:44:13.984358 5200 generic.go:358] "Generic (PLEG): container finished" podID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerID="9d8df21b1e5bf90ba682987ce9d51df802d05d36665df75904fa01f3e16450e3" exitCode=0 Nov 26 13:44:13 crc kubenswrapper[5200]: I1126 13:44:13.984441 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" event={"ID":"fc65da8b-cbfb-4cc8-8a63-a749d7306185","Type":"ContainerDied","Data":"9d8df21b1e5bf90ba682987ce9d51df802d05d36665df75904fa01f3e16450e3"} Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.332080 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.432087 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hms9p\" (UniqueName: \"kubernetes.io/projected/fc65da8b-cbfb-4cc8-8a63-a749d7306185-kube-api-access-hms9p\") pod \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.433846 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-bundle\") pod \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.435999 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-bundle" (OuterVolumeSpecName: "bundle") pod "fc65da8b-cbfb-4cc8-8a63-a749d7306185" (UID: "fc65da8b-cbfb-4cc8-8a63-a749d7306185"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.436877 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-util\") pod \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\" (UID: \"fc65da8b-cbfb-4cc8-8a63-a749d7306185\") " Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.437316 5200 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.445026 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc65da8b-cbfb-4cc8-8a63-a749d7306185-kube-api-access-hms9p" (OuterVolumeSpecName: "kube-api-access-hms9p") pod "fc65da8b-cbfb-4cc8-8a63-a749d7306185" (UID: "fc65da8b-cbfb-4cc8-8a63-a749d7306185"). InnerVolumeSpecName "kube-api-access-hms9p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.449772 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-util" (OuterVolumeSpecName: "util") pod "fc65da8b-cbfb-4cc8-8a63-a749d7306185" (UID: "fc65da8b-cbfb-4cc8-8a63-a749d7306185"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.538918 5200 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc65da8b-cbfb-4cc8-8a63-a749d7306185-util\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:15 crc kubenswrapper[5200]: I1126 13:44:15.538969 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hms9p\" (UniqueName: \"kubernetes.io/projected/fc65da8b-cbfb-4cc8-8a63-a749d7306185-kube-api-access-hms9p\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:16 crc kubenswrapper[5200]: I1126 13:44:16.004461 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" event={"ID":"fc65da8b-cbfb-4cc8-8a63-a749d7306185","Type":"ContainerDied","Data":"8f7d227a18a1d5cc02b28ecadd256173bdbad08674f008ee0b006bf9db6705f1"} Nov 26 13:44:16 crc kubenswrapper[5200]: I1126 13:44:16.004528 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f7d227a18a1d5cc02b28ecadd256173bdbad08674f008ee0b006bf9db6705f1" Nov 26 13:44:16 crc kubenswrapper[5200]: I1126 13:44:16.004595 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd" Nov 26 13:44:17 crc kubenswrapper[5200]: I1126 13:44:17.751753 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvd56" Nov 26 13:44:18 crc kubenswrapper[5200]: I1126 13:44:18.173859 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:18 crc kubenswrapper[5200]: I1126 13:44:18.173946 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:18 crc kubenswrapper[5200]: I1126 13:44:18.231725 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:19 crc kubenswrapper[5200]: I1126 13:44:19.076144 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:19 crc kubenswrapper[5200]: I1126 13:44:19.144243 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b75mp"] Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.464841 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6757c57d5-xxf6c"] Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465641 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="pull" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465658 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="pull" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465676 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="util" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465721 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="util" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465740 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="extract" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465748 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="extract" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.465866 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc65da8b-cbfb-4cc8-8a63-a749d7306185" containerName="extract" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.517168 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6757c57d5-xxf6c"] Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.517390 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.520027 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"nmstate-operator-dockercfg-g5sjp\"" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.520016 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-nmstate\"/\"openshift-service-ca.crt\"" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.521770 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-nmstate\"/\"kube-root-ca.crt\"" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.630649 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drmss\" (UniqueName: \"kubernetes.io/projected/8a84f308-5641-49b3-8c43-b5d3256d74b8-kube-api-access-drmss\") pod \"nmstate-operator-6757c57d5-xxf6c\" (UID: \"8a84f308-5641-49b3-8c43-b5d3256d74b8\") " pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.732466 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drmss\" (UniqueName: \"kubernetes.io/projected/8a84f308-5641-49b3-8c43-b5d3256d74b8-kube-api-access-drmss\") pod \"nmstate-operator-6757c57d5-xxf6c\" (UID: \"8a84f308-5641-49b3-8c43-b5d3256d74b8\") " pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.754399 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drmss\" (UniqueName: \"kubernetes.io/projected/8a84f308-5641-49b3-8c43-b5d3256d74b8-kube-api-access-drmss\") pod \"nmstate-operator-6757c57d5-xxf6c\" (UID: \"8a84f308-5641-49b3-8c43-b5d3256d74b8\") " pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:20 crc kubenswrapper[5200]: I1126 13:44:20.839432 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.048269 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b75mp" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="registry-server" containerID="cri-o://66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c" gracePeriod=2 Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.093921 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6757c57d5-xxf6c"] Nov 26 13:44:21 crc kubenswrapper[5200]: W1126 13:44:21.102867 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a84f308_5641_49b3_8c43_b5d3256d74b8.slice/crio-7b64af13cdb18729931c9156b4e5558e5a6644bc020d5e09eeef54ba00b6b151 WatchSource:0}: Error finding container 7b64af13cdb18729931c9156b4e5558e5a6644bc020d5e09eeef54ba00b6b151: Status 404 returned error can't find the container with id 7b64af13cdb18729931c9156b4e5558e5a6644bc020d5e09eeef54ba00b6b151 Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.426993 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.542225 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-catalog-content\") pod \"4659f491-d71a-4c0f-ab82-9119d4facb07\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.542386 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzkl\" (UniqueName: \"kubernetes.io/projected/4659f491-d71a-4c0f-ab82-9119d4facb07-kube-api-access-mfzkl\") pod \"4659f491-d71a-4c0f-ab82-9119d4facb07\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.542434 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-utilities\") pod \"4659f491-d71a-4c0f-ab82-9119d4facb07\" (UID: \"4659f491-d71a-4c0f-ab82-9119d4facb07\") " Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.543927 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-utilities" (OuterVolumeSpecName: "utilities") pod "4659f491-d71a-4c0f-ab82-9119d4facb07" (UID: "4659f491-d71a-4c0f-ab82-9119d4facb07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.549797 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4659f491-d71a-4c0f-ab82-9119d4facb07-kube-api-access-mfzkl" (OuterVolumeSpecName: "kube-api-access-mfzkl") pod "4659f491-d71a-4c0f-ab82-9119d4facb07" (UID: "4659f491-d71a-4c0f-ab82-9119d4facb07"). InnerVolumeSpecName "kube-api-access-mfzkl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.633814 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4659f491-d71a-4c0f-ab82-9119d4facb07" (UID: "4659f491-d71a-4c0f-ab82-9119d4facb07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.644252 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.644295 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfzkl\" (UniqueName: \"kubernetes.io/projected/4659f491-d71a-4c0f-ab82-9119d4facb07-kube-api-access-mfzkl\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:21 crc kubenswrapper[5200]: I1126 13:44:21.644312 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4659f491-d71a-4c0f-ab82-9119d4facb07-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.056211 5200 generic.go:358] "Generic (PLEG): container finished" podID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerID="66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c" exitCode=0 Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.056336 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b75mp" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.056319 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b75mp" event={"ID":"4659f491-d71a-4c0f-ab82-9119d4facb07","Type":"ContainerDied","Data":"66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c"} Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.056894 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b75mp" event={"ID":"4659f491-d71a-4c0f-ab82-9119d4facb07","Type":"ContainerDied","Data":"62458a6aabfa96c39da58e9dea762016f4087490ef9c192f3602e8123c5ac6d0"} Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.056924 5200 scope.go:117] "RemoveContainer" containerID="66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.066889 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" event={"ID":"8a84f308-5641-49b3-8c43-b5d3256d74b8","Type":"ContainerStarted","Data":"7b64af13cdb18729931c9156b4e5558e5a6644bc020d5e09eeef54ba00b6b151"} Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.088933 5200 scope.go:117] "RemoveContainer" containerID="cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.093116 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b75mp"] Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.100548 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b75mp"] Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.111262 5200 scope.go:117] "RemoveContainer" containerID="95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.125714 5200 scope.go:117] "RemoveContainer" containerID="66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c" Nov 26 13:44:22 crc kubenswrapper[5200]: E1126 13:44:22.126964 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c\": container with ID starting with 66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c not found: ID does not exist" containerID="66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.127002 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c"} err="failed to get container status \"66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c\": rpc error: code = NotFound desc = could not find container \"66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c\": container with ID starting with 66e3c2e35e3257572715583439d7ccc831b8aa04cedc1d5103880a71ac2fdb9c not found: ID does not exist" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.127025 5200 scope.go:117] "RemoveContainer" containerID="cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2" Nov 26 13:44:22 crc kubenswrapper[5200]: E1126 13:44:22.127604 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2\": container with ID starting with cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2 not found: ID does not exist" containerID="cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.127711 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2"} err="failed to get container status \"cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2\": rpc error: code = NotFound desc = could not find container \"cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2\": container with ID starting with cd211e6b24def3e6e188a00007badf8a06e2f4b745e30070df69de9adc0b1ab2 not found: ID does not exist" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.127777 5200 scope.go:117] "RemoveContainer" containerID="95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc" Nov 26 13:44:22 crc kubenswrapper[5200]: E1126 13:44:22.128148 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc\": container with ID starting with 95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc not found: ID does not exist" containerID="95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc" Nov 26 13:44:22 crc kubenswrapper[5200]: I1126 13:44:22.128173 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc"} err="failed to get container status \"95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc\": rpc error: code = NotFound desc = could not find container \"95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc\": container with ID starting with 95c85121cdeca5fcfe674328ae55409b3094fa8957dcbd9546b6eac80f315afc not found: ID does not exist" Nov 26 13:44:23 crc kubenswrapper[5200]: I1126 13:44:23.004876 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" 
path="/var/lib/kubelet/pods/4659f491-d71a-4c0f-ab82-9119d4facb07/volumes" Nov 26 13:44:25 crc kubenswrapper[5200]: I1126 13:44:25.912091 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-jdphn" Nov 26 13:44:25 crc kubenswrapper[5200]: I1126 13:44:25.990774 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-k5pvc"] Nov 26 13:44:28 crc kubenswrapper[5200]: I1126 13:44:28.121386 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" event={"ID":"8a84f308-5641-49b3-8c43-b5d3256d74b8","Type":"ContainerStarted","Data":"58089bb4f73e2256a89631144ad025d1c1461f197ed84b4d5e108abd0a8d60d8"} Nov 26 13:44:28 crc kubenswrapper[5200]: I1126 13:44:28.121972 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:28 crc kubenswrapper[5200]: I1126 13:44:28.148366 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" podStartSLOduration=2.259011448 podStartE2EDuration="8.148335974s" podCreationTimestamp="2025-11-26 13:44:20 +0000 UTC" firstStartedPulling="2025-11-26 13:44:21.10771692 +0000 UTC m=+829.786918738" lastFinishedPulling="2025-11-26 13:44:26.997041446 +0000 UTC m=+835.676243264" observedRunningTime="2025-11-26 13:44:28.145679511 +0000 UTC m=+836.824881409" watchObservedRunningTime="2025-11-26 13:44:28.148335974 +0000 UTC m=+836.827537872" Nov 26 13:44:39 crc kubenswrapper[5200]: I1126 13:44:39.132462 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-operator-6757c57d5-xxf6c" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.248342 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-75c64559db-98h79"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.250850 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="registry-server" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.250986 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="registry-server" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.251102 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="extract-utilities" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.251182 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="extract-utilities" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.251276 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="extract-content" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.251350 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="extract-content" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.251556 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4659f491-d71a-4c0f-ab82-9119d4facb07" containerName="registry-server" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.256147 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.259779 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-f842z\"" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.267092 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.274760 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.277111 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"openshift-nmstate-webhook\"" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.292266 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-65tl2"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.296666 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-75c64559db-98h79"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.296859 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.303728 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.355194 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzdn\" (UniqueName: \"kubernetes.io/projected/55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b-kube-api-access-8nzdn\") pod \"nmstate-metrics-75c64559db-98h79\" (UID: \"55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b\") " pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.355263 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vz2q\" (UniqueName: \"kubernetes.io/projected/a74eea18-7f92-4c34-a977-fe0298221149-kube-api-access-9vz2q\") pod \"nmstate-webhook-6dccbdf6bd-x6pb5\" (UID: \"a74eea18-7f92-4c34-a977-fe0298221149\") " pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.355312 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a74eea18-7f92-4c34-a977-fe0298221149-tls-key-pair\") pod \"nmstate-webhook-6dccbdf6bd-x6pb5\" (UID: \"a74eea18-7f92-4c34-a977-fe0298221149\") " pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.433635 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.439646 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.441553 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"default-dockercfg-qm2c4\"" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.442139 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-nmstate\"/\"nginx-conf\"" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.442453 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"plugin-serving-cert\"" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.450977 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.457049 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vz2q\" (UniqueName: \"kubernetes.io/projected/a74eea18-7f92-4c34-a977-fe0298221149-kube-api-access-9vz2q\") pod \"nmstate-webhook-6dccbdf6bd-x6pb5\" (UID: \"a74eea18-7f92-4c34-a977-fe0298221149\") " pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.457114 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a74eea18-7f92-4c34-a977-fe0298221149-tls-key-pair\") pod \"nmstate-webhook-6dccbdf6bd-x6pb5\" (UID: \"a74eea18-7f92-4c34-a977-fe0298221149\") " pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.457911 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-nmstate-lock\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.457960 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbr2d\" (UniqueName: \"kubernetes.io/projected/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-kube-api-access-qbr2d\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.458234 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-dbus-socket\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.458278 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-ovs-socket\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.458353 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nzdn\" (UniqueName: \"kubernetes.io/projected/55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b-kube-api-access-8nzdn\") pod 
\"nmstate-metrics-75c64559db-98h79\" (UID: \"55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b\") " pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.471743 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a74eea18-7f92-4c34-a977-fe0298221149-tls-key-pair\") pod \"nmstate-webhook-6dccbdf6bd-x6pb5\" (UID: \"a74eea18-7f92-4c34-a977-fe0298221149\") " pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.476291 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vz2q\" (UniqueName: \"kubernetes.io/projected/a74eea18-7f92-4c34-a977-fe0298221149-kube-api-access-9vz2q\") pod \"nmstate-webhook-6dccbdf6bd-x6pb5\" (UID: \"a74eea18-7f92-4c34-a977-fe0298221149\") " pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.477747 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzdn\" (UniqueName: \"kubernetes.io/projected/55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b-kube-api-access-8nzdn\") pod \"nmstate-metrics-75c64559db-98h79\" (UID: \"55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b\") " pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560483 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/972422a2-6a6a-4def-bdfb-c620ac76ae3b-plugin-serving-cert\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560573 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2mv\" (UniqueName: \"kubernetes.io/projected/972422a2-6a6a-4def-bdfb-c620ac76ae3b-kube-api-access-wb2mv\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560677 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-nmstate-lock\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560739 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbr2d\" (UniqueName: \"kubernetes.io/projected/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-kube-api-access-qbr2d\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560833 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-dbus-socket\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560866 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-ovs-socket\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560875 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-nmstate-lock\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.560916 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/972422a2-6a6a-4def-bdfb-c620ac76ae3b-nginx-conf\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.561029 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-ovs-socket\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.561429 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-dbus-socket\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.583644 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.590611 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbr2d\" (UniqueName: \"kubernetes.io/projected/50b5e447-f56c-46cb-a3d7-fc31889a8dd0-kube-api-access-qbr2d\") pod \"nmstate-handler-65tl2\" (UID: \"50b5e447-f56c-46cb-a3d7-fc31889a8dd0\") " pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.603492 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.621187 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.649152 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-586f75c45b-2f5c7"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.662601 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/972422a2-6a6a-4def-bdfb-c620ac76ae3b-nginx-conf\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.662775 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/972422a2-6a6a-4def-bdfb-c620ac76ae3b-plugin-serving-cert\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.662810 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2mv\" (UniqueName: \"kubernetes.io/projected/972422a2-6a6a-4def-bdfb-c620ac76ae3b-kube-api-access-wb2mv\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.663871 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/972422a2-6a6a-4def-bdfb-c620ac76ae3b-nginx-conf\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.665862 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.670111 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586f75c45b-2f5c7"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.673424 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/972422a2-6a6a-4def-bdfb-c620ac76ae3b-plugin-serving-cert\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.694610 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2mv\" (UniqueName: \"kubernetes.io/projected/972422a2-6a6a-4def-bdfb-c620ac76ae3b-kube-api-access-wb2mv\") pod \"nmstate-console-plugin-8b88fbd7-ffhn9\" (UID: \"972422a2-6a6a-4def-bdfb-c620ac76ae3b\") " pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764141 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-trusted-ca-bundle\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764235 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-oauth-serving-cert\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764290 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvn7r\" (UniqueName: \"kubernetes.io/projected/e2a69765-6941-4787-b2d6-8f0b391d1f61-kube-api-access-rvn7r\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764360 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-serving-cert\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764386 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-service-ca\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764413 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-oauth-config\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " 
pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.764437 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-config\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.769394 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.852674 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5"] Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.907820 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-serving-cert\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.908456 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-service-ca\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.908492 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-oauth-config\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.908551 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-config\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.908617 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-trusted-ca-bundle\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.908664 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-oauth-serving-cert\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.908829 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvn7r\" (UniqueName: \"kubernetes.io/projected/e2a69765-6941-4787-b2d6-8f0b391d1f61-kube-api-access-rvn7r\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " 
pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.909391 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-config\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.910138 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-oauth-serving-cert\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.910885 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-service-ca\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.911125 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a69765-6941-4787-b2d6-8f0b391d1f61-trusted-ca-bundle\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.914419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-serving-cert\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.920884 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a69765-6941-4787-b2d6-8f0b391d1f61-console-oauth-config\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.932936 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvn7r\" (UniqueName: \"kubernetes.io/projected/e2a69765-6941-4787-b2d6-8f0b391d1f61-kube-api-access-rvn7r\") pod \"console-586f75c45b-2f5c7\" (UID: \"e2a69765-6941-4787-b2d6-8f0b391d1f61\") " pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:40 crc kubenswrapper[5200]: I1126 13:44:40.958448 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-75c64559db-98h79"] Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.003183 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.030044 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9"] Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.217459 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" event={"ID":"972422a2-6a6a-4def-bdfb-c620ac76ae3b","Type":"ContainerStarted","Data":"dcf3a6fa6d79559eab1fca0e7c95634d1895f2f82fa1e9245be9e631b2b190ef"} Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.219160 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" event={"ID":"55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b","Type":"ContainerStarted","Data":"ee45359886c52de62521f447462a3788b6d23ff0fef614e5ce600dac3fa7f3a3"} Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.221186 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-65tl2" event={"ID":"50b5e447-f56c-46cb-a3d7-fc31889a8dd0","Type":"ContainerStarted","Data":"1cba4335cff2682c10172acfe1fad36a1361c7871ea703724d502d0e9a9a14f4"} Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.224079 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" event={"ID":"a74eea18-7f92-4c34-a977-fe0298221149","Type":"ContainerStarted","Data":"aaff487b0ea280af1e87801d4fcd0e20868bec3ac72b43f8716e237e895f4348"} Nov 26 13:44:41 crc kubenswrapper[5200]: I1126 13:44:41.258881 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586f75c45b-2f5c7"] Nov 26 13:44:41 crc kubenswrapper[5200]: W1126 13:44:41.266923 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a69765_6941_4787_b2d6_8f0b391d1f61.slice/crio-c3ddc641169f7dac4c2c5f611595bf749f31b6f552aa2635db86b07539464f23 WatchSource:0}: Error finding container c3ddc641169f7dac4c2c5f611595bf749f31b6f552aa2635db86b07539464f23: Status 404 returned error can't find the container with id c3ddc641169f7dac4c2c5f611595bf749f31b6f552aa2635db86b07539464f23 Nov 26 13:44:42 crc kubenswrapper[5200]: I1126 13:44:42.236849 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586f75c45b-2f5c7" event={"ID":"e2a69765-6941-4787-b2d6-8f0b391d1f61","Type":"ContainerStarted","Data":"4e66b4fdd88b71183a5db95246e204d0380f5fbd0300c9fd7c03c644ce25fa78"} Nov 26 13:44:42 crc kubenswrapper[5200]: I1126 13:44:42.237350 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586f75c45b-2f5c7" event={"ID":"e2a69765-6941-4787-b2d6-8f0b391d1f61","Type":"ContainerStarted","Data":"c3ddc641169f7dac4c2c5f611595bf749f31b6f552aa2635db86b07539464f23"} Nov 26 13:44:42 crc kubenswrapper[5200]: I1126 13:44:42.265502 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-586f75c45b-2f5c7" podStartSLOduration=2.265459488 podStartE2EDuration="2.265459488s" podCreationTimestamp="2025-11-26 13:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:44:42.264098684 +0000 UTC m=+850.943300552" watchObservedRunningTime="2025-11-26 13:44:42.265459488 +0000 UTC m=+850.944661346" Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.265325 5200 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" event={"ID":"55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b","Type":"ContainerStarted","Data":"f909a347f4827f1947b4dfeceedfb6055ecc3d72f2726a8b7b6d0f8505abdd13"} Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.267402 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-65tl2" event={"ID":"50b5e447-f56c-46cb-a3d7-fc31889a8dd0","Type":"ContainerStarted","Data":"e116f38c5c4af8b32226069c0a680c27fd5c91033276b0ceb840ba605cf0ba01"} Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.269936 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" event={"ID":"a74eea18-7f92-4c34-a977-fe0298221149","Type":"ContainerStarted","Data":"9e1488702b7e80c9ac3b3f17a3e51c076081b25878a18aaac6f97cd1b692e55f"} Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.270179 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.273120 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" event={"ID":"972422a2-6a6a-4def-bdfb-c620ac76ae3b","Type":"ContainerStarted","Data":"00875fa679ac513c990f5ff26712baeb6cc4f1cd3741f934ed279668f87a45df"} Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.273495 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.300772 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" podStartSLOduration=2.090815954 podStartE2EDuration="5.300748551s" podCreationTimestamp="2025-11-26 13:44:40 +0000 UTC" firstStartedPulling="2025-11-26 13:44:40.918244415 +0000 UTC m=+849.597446233" lastFinishedPulling="2025-11-26 13:44:44.128176982 +0000 UTC m=+852.807378830" observedRunningTime="2025-11-26 13:44:45.294396687 +0000 UTC m=+853.973598545" watchObservedRunningTime="2025-11-26 13:44:45.300748551 +0000 UTC m=+853.979950379" Nov 26 13:44:45 crc kubenswrapper[5200]: I1126 13:44:45.325561 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" podStartSLOduration=2.2793708 podStartE2EDuration="5.325534399s" podCreationTimestamp="2025-11-26 13:44:40 +0000 UTC" firstStartedPulling="2025-11-26 13:44:41.047368157 +0000 UTC m=+849.726569975" lastFinishedPulling="2025-11-26 13:44:44.093531746 +0000 UTC m=+852.772733574" observedRunningTime="2025-11-26 13:44:45.321308203 +0000 UTC m=+854.000510091" watchObservedRunningTime="2025-11-26 13:44:45.325534399 +0000 UTC m=+854.004736257" Nov 26 13:44:46 crc kubenswrapper[5200]: I1126 13:44:46.281014 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:46 crc kubenswrapper[5200]: I1126 13:44:46.308651 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-65tl2" podStartSLOduration=2.8774833319999997 podStartE2EDuration="6.308586684s" podCreationTimestamp="2025-11-26 13:44:40 +0000 UTC" firstStartedPulling="2025-11-26 13:44:40.665319596 +0000 UTC m=+849.344521414" lastFinishedPulling="2025-11-26 13:44:44.096422908 +0000 UTC 
m=+852.775624766" observedRunningTime="2025-11-26 13:44:46.303157226 +0000 UTC m=+854.982359064" watchObservedRunningTime="2025-11-26 13:44:46.308586684 +0000 UTC m=+854.987788542" Nov 26 13:44:47 crc kubenswrapper[5200]: I1126 13:44:47.291981 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" event={"ID":"55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b","Type":"ContainerStarted","Data":"93174b9f11db47ace252b95557e12c2d274837f90d71460ffc42f1d7d4962149"} Nov 26 13:44:47 crc kubenswrapper[5200]: I1126 13:44:47.322253 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" podStartSLOduration=1.22130934 podStartE2EDuration="7.322232333s" podCreationTimestamp="2025-11-26 13:44:40 +0000 UTC" firstStartedPulling="2025-11-26 13:44:40.966951395 +0000 UTC m=+849.646153213" lastFinishedPulling="2025-11-26 13:44:47.067874388 +0000 UTC m=+855.747076206" observedRunningTime="2025-11-26 13:44:47.317607039 +0000 UTC m=+855.996808867" watchObservedRunningTime="2025-11-26 13:44:47.322232333 +0000 UTC m=+856.001434151" Nov 26 13:44:48 crc kubenswrapper[5200]: I1126 13:44:48.301424 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.004827 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.007003 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.013347 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.054800 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" podUID="aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" containerName="registry" containerID="cri-o://a1e44980bc0b4befff3a2cc4569d65ef710ef19fa6c0f5e45c6bf4eecacfb055" gracePeriod=30 Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.344056 5200 generic.go:358] "Generic (PLEG): container finished" podID="aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" containerID="a1e44980bc0b4befff3a2cc4569d65ef710ef19fa6c0f5e45c6bf4eecacfb055" exitCode=0 Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.346328 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" event={"ID":"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa","Type":"ContainerDied","Data":"a1e44980bc0b4befff3a2cc4569d65ef710ef19fa6c0f5e45c6bf4eecacfb055"} Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.355322 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-586f75c45b-2f5c7" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.430416 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d44f6ddf-vvkzj"] Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.511099 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.635025 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-bound-sa-token\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.635557 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.635738 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-installation-pull-secrets\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.635831 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-certificates\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.636017 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-trusted-ca\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.636115 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-ca-trust-extracted\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.636235 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-tls\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.636339 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2zj\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-kube-api-access-7h2zj\") pod \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\" (UID: \"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa\") " Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.637006 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.637157 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.644104 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.645164 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.646062 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-kube-api-access-7h2zj" (OuterVolumeSpecName: "kube-api-access-7h2zj") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "kube-api-access-7h2zj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.650296 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "registry-storage") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.656456 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.658422 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" (UID: "aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738429 5200 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738483 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7h2zj\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-kube-api-access-7h2zj\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738498 5200 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738508 5200 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738518 5200 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738527 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:51 crc kubenswrapper[5200]: I1126 13:44:51.738536 5200 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.337459 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-65tl2" Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.354136 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" event={"ID":"aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa","Type":"ContainerDied","Data":"8133525f49bd2c01485a588810e93109ccd0e3081f2c8fc5a8899887f8fd03d5"} Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.354197 5200 scope.go:117] "RemoveContainer" containerID="a1e44980bc0b4befff3a2cc4569d65ef710ef19fa6c0f5e45c6bf4eecacfb055" Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.354194 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-k5pvc" Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.431948 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-k5pvc"] Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.441275 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-k5pvc"] Nov 26 13:44:52 crc kubenswrapper[5200]: I1126 13:44:52.985533 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" path="/var/lib/kubelet/pods/aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa/volumes" Nov 26 13:44:56 crc kubenswrapper[5200]: I1126 13:44:56.284909 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-console-plugin-8b88fbd7-ffhn9" Nov 26 13:44:56 crc kubenswrapper[5200]: I1126 13:44:56.289038 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6dccbdf6bd-x6pb5" Nov 26 13:44:59 crc kubenswrapper[5200]: I1126 13:44:59.315550 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-metrics-75c64559db-98h79" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.147356 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg"] Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.148663 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" containerName="registry" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.148716 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" containerName="registry" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.148899 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="aae19c49-1c67-4da5-b8a0-da6a4fd4f2fa" containerName="registry" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.167042 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg"] Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.168887 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.174757 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.179119 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.335893 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74bx\" (UniqueName: \"kubernetes.io/projected/0714c26b-3adf-46af-81e6-44003392d822-kube-api-access-c74bx\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.335956 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0714c26b-3adf-46af-81e6-44003392d822-secret-volume\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.336004 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0714c26b-3adf-46af-81e6-44003392d822-config-volume\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.437983 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c74bx\" (UniqueName: \"kubernetes.io/projected/0714c26b-3adf-46af-81e6-44003392d822-kube-api-access-c74bx\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.438060 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0714c26b-3adf-46af-81e6-44003392d822-secret-volume\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.438127 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0714c26b-3adf-46af-81e6-44003392d822-config-volume\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.439146 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0714c26b-3adf-46af-81e6-44003392d822-config-volume\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 
26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.446510 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0714c26b-3adf-46af-81e6-44003392d822-secret-volume\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.467108 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74bx\" (UniqueName: \"kubernetes.io/projected/0714c26b-3adf-46af-81e6-44003392d822-kube-api-access-c74bx\") pod \"collect-profiles-29402745-ccbbg\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.497520 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:00 crc kubenswrapper[5200]: I1126 13:45:00.815035 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg"] Nov 26 13:45:01 crc kubenswrapper[5200]: I1126 13:45:01.432789 5200 generic.go:358] "Generic (PLEG): container finished" podID="0714c26b-3adf-46af-81e6-44003392d822" containerID="8159c589137cfaf476da2f9dd08bd77a722b9a7f11a49087e00ef5bffe4c6dc9" exitCode=0 Nov 26 13:45:01 crc kubenswrapper[5200]: I1126 13:45:01.432861 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" event={"ID":"0714c26b-3adf-46af-81e6-44003392d822","Type":"ContainerDied","Data":"8159c589137cfaf476da2f9dd08bd77a722b9a7f11a49087e00ef5bffe4c6dc9"} Nov 26 13:45:01 crc kubenswrapper[5200]: I1126 13:45:01.432945 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" event={"ID":"0714c26b-3adf-46af-81e6-44003392d822","Type":"ContainerStarted","Data":"d319e6fc34e71633ed3ed06ce6f08aa2d6478b42f70819cbf3d3224001305d26"} Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.701818 5200 util.go:48] "No ready sandbox for pod can be found. 
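The "SyncLoop ADD" / "SyncLoop UPDATE" entries with source="api" above are the kubelet picking up pod objects from its API-server watch, and the VerifyControllerAttachedVolume / MountVolume.SetUp entries are its volume manager wiring up the pod's volumes before a sandbox is created. A minimal client-go sketch (not the kubelet's own code) of watching the same add/modify/delete stream for that namespace; the kubeconfig path is a placeholder:

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder kubeconfig path; any config with read access to the namespace works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Same namespace as the collect-profiles pod in the log above.
	w, err := cs.CoreV1().Pods("openshift-operator-lifecycle-manager").
		Watch(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		// ADDED/MODIFIED/DELETED roughly mirror the kubelet's SyncLoop ADD/UPDATE/DELETE lines.
		fmt.Printf("%s %s/%s phase=%s\n", ev.Type, pod.Namespace, pod.Name, pod.Status.Phase)
	}
}
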
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.878536 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0714c26b-3adf-46af-81e6-44003392d822-secret-volume\") pod \"0714c26b-3adf-46af-81e6-44003392d822\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.878767 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74bx\" (UniqueName: \"kubernetes.io/projected/0714c26b-3adf-46af-81e6-44003392d822-kube-api-access-c74bx\") pod \"0714c26b-3adf-46af-81e6-44003392d822\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.878845 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0714c26b-3adf-46af-81e6-44003392d822-config-volume\") pod \"0714c26b-3adf-46af-81e6-44003392d822\" (UID: \"0714c26b-3adf-46af-81e6-44003392d822\") " Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.880232 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0714c26b-3adf-46af-81e6-44003392d822-config-volume" (OuterVolumeSpecName: "config-volume") pod "0714c26b-3adf-46af-81e6-44003392d822" (UID: "0714c26b-3adf-46af-81e6-44003392d822"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.885900 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0714c26b-3adf-46af-81e6-44003392d822-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0714c26b-3adf-46af-81e6-44003392d822" (UID: "0714c26b-3adf-46af-81e6-44003392d822"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.887726 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0714c26b-3adf-46af-81e6-44003392d822-kube-api-access-c74bx" (OuterVolumeSpecName: "kube-api-access-c74bx") pod "0714c26b-3adf-46af-81e6-44003392d822" (UID: "0714c26b-3adf-46af-81e6-44003392d822"). InnerVolumeSpecName "kube-api-access-c74bx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.980042 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c74bx\" (UniqueName: \"kubernetes.io/projected/0714c26b-3adf-46af-81e6-44003392d822-kube-api-access-c74bx\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.980082 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0714c26b-3adf-46af-81e6-44003392d822-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:02 crc kubenswrapper[5200]: I1126 13:45:02.980092 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0714c26b-3adf-46af-81e6-44003392d822-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:03 crc kubenswrapper[5200]: I1126 13:45:03.457179 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" event={"ID":"0714c26b-3adf-46af-81e6-44003392d822","Type":"ContainerDied","Data":"d319e6fc34e71633ed3ed06ce6f08aa2d6478b42f70819cbf3d3224001305d26"} Nov 26 13:45:03 crc kubenswrapper[5200]: I1126 13:45:03.457255 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d319e6fc34e71633ed3ed06ce6f08aa2d6478b42f70819cbf3d3224001305d26" Nov 26 13:45:03 crc kubenswrapper[5200]: I1126 13:45:03.457479 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.850757 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8rd67"] Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.853013 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0714c26b-3adf-46af-81e6-44003392d822" containerName="collect-profiles" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.853133 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0714c26b-3adf-46af-81e6-44003392d822" containerName="collect-profiles" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.853327 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0714c26b-3adf-46af-81e6-44003392d822" containerName="collect-profiles" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.868238 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.870793 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rd67"] Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.911312 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-catalog-content\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.911377 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxkl\" (UniqueName: \"kubernetes.io/projected/aec2f55c-b101-4413-809e-ba41a4a16d96-kube-api-access-kmxkl\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:04 crc kubenswrapper[5200]: I1126 13:45:04.911416 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-utilities\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.012671 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxkl\" (UniqueName: \"kubernetes.io/projected/aec2f55c-b101-4413-809e-ba41a4a16d96-kube-api-access-kmxkl\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.013643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-utilities\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.014003 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-utilities\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.014561 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-catalog-content\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.014916 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-catalog-content\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.038339 5200 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kmxkl\" (UniqueName: \"kubernetes.io/projected/aec2f55c-b101-4413-809e-ba41a4a16d96-kube-api-access-kmxkl\") pod \"redhat-marketplace-8rd67\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.188911 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:05 crc kubenswrapper[5200]: I1126 13:45:05.604327 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rd67"] Nov 26 13:45:05 crc kubenswrapper[5200]: W1126 13:45:05.609118 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaec2f55c_b101_4413_809e_ba41a4a16d96.slice/crio-2d78d8546200e5d6772469bf27e33ad52f8f686d53c11b8a2edb27f1ea6e7803 WatchSource:0}: Error finding container 2d78d8546200e5d6772469bf27e33ad52f8f686d53c11b8a2edb27f1ea6e7803: Status 404 returned error can't find the container with id 2d78d8546200e5d6772469bf27e33ad52f8f686d53c11b8a2edb27f1ea6e7803 Nov 26 13:45:06 crc kubenswrapper[5200]: I1126 13:45:06.488079 5200 generic.go:358] "Generic (PLEG): container finished" podID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerID="f51ea28c1a2e2ea2b10b042d1629d552c68752c0d8686e91353b3366d00bde10" exitCode=0 Nov 26 13:45:06 crc kubenswrapper[5200]: I1126 13:45:06.488129 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rd67" event={"ID":"aec2f55c-b101-4413-809e-ba41a4a16d96","Type":"ContainerDied","Data":"f51ea28c1a2e2ea2b10b042d1629d552c68752c0d8686e91353b3366d00bde10"} Nov 26 13:45:06 crc kubenswrapper[5200]: I1126 13:45:06.488915 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rd67" event={"ID":"aec2f55c-b101-4413-809e-ba41a4a16d96","Type":"ContainerStarted","Data":"2d78d8546200e5d6772469bf27e33ad52f8f686d53c11b8a2edb27f1ea6e7803"} Nov 26 13:45:08 crc kubenswrapper[5200]: I1126 13:45:08.512123 5200 generic.go:358] "Generic (PLEG): container finished" podID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerID="7aa7a3792efb93c413a9b858d528d3d111ec217007c9b2e58e558e91115763f7" exitCode=0 Nov 26 13:45:08 crc kubenswrapper[5200]: I1126 13:45:08.512358 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rd67" event={"ID":"aec2f55c-b101-4413-809e-ba41a4a16d96","Type":"ContainerDied","Data":"7aa7a3792efb93c413a9b858d528d3d111ec217007c9b2e58e558e91115763f7"} Nov 26 13:45:09 crc kubenswrapper[5200]: I1126 13:45:09.527160 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rd67" event={"ID":"aec2f55c-b101-4413-809e-ba41a4a16d96","Type":"ContainerStarted","Data":"7066f91c7971bc48952a376a2c3f9fff82a0ff0d13c799584349861560f80cf5"} Nov 26 13:45:09 crc kubenswrapper[5200]: I1126 13:45:09.565464 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8rd67" podStartSLOduration=4.672878095 podStartE2EDuration="5.565443497s" podCreationTimestamp="2025-11-26 13:45:04 +0000 UTC" firstStartedPulling="2025-11-26 13:45:06.489142033 +0000 UTC m=+875.168343861" lastFinishedPulling="2025-11-26 13:45:07.381707435 +0000 UTC m=+876.060909263" observedRunningTime="2025-11-26 13:45:09.555604619 +0000 UTC m=+878.234806447" 
watchObservedRunningTime="2025-11-26 13:45:09.565443497 +0000 UTC m=+878.244645315" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.434545 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kbfg4"] Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.447757 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.450637 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbfg4"] Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.469989 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-catalog-content\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.470102 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-utilities\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.470191 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzwz\" (UniqueName: \"kubernetes.io/projected/d87a82c4-8157-4640-b90a-42827c4232eb-kube-api-access-sdzwz\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.572519 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzwz\" (UniqueName: \"kubernetes.io/projected/d87a82c4-8157-4640-b90a-42827c4232eb-kube-api-access-sdzwz\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.572694 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-catalog-content\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.572763 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-utilities\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.573339 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-catalog-content\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.573421 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-utilities\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.599103 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzwz\" (UniqueName: \"kubernetes.io/projected/d87a82c4-8157-4640-b90a-42827c4232eb-kube-api-access-sdzwz\") pod \"community-operators-kbfg4\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:12 crc kubenswrapper[5200]: I1126 13:45:12.773946 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.064352 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbfg4"] Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.568511 5200 generic.go:358] "Generic (PLEG): container finished" podID="d87a82c4-8157-4640-b90a-42827c4232eb" containerID="093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc" exitCode=0 Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.568737 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerDied","Data":"093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc"} Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.568830 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerStarted","Data":"6cd9be5db3b26d2d11ebe20a0571510d511fc17f0e545a9a3784e5c210a81aa1"} Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.688004 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg"] Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.693895 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.696771 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.706955 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg"] Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.856480 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-bundle\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.857203 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-util\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.857321 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rbj\" (UniqueName: \"kubernetes.io/projected/40053995-8fb5-4ef3-a6c1-06570eaa42cb-kube-api-access-28rbj\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.959792 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28rbj\" (UniqueName: \"kubernetes.io/projected/40053995-8fb5-4ef3-a6c1-06570eaa42cb-kube-api-access-28rbj\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.959985 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-bundle\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.960036 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-util\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.960843 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-util\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:13 crc kubenswrapper[5200]: I1126 13:45:13.961469 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-bundle\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:14 crc kubenswrapper[5200]: I1126 13:45:14.002519 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rbj\" (UniqueName: \"kubernetes.io/projected/40053995-8fb5-4ef3-a6c1-06570eaa42cb-kube-api-access-28rbj\") pod \"546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:14 crc kubenswrapper[5200]: I1126 13:45:14.028668 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:14 crc kubenswrapper[5200]: I1126 13:45:14.271950 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg"] Nov 26 13:45:14 crc kubenswrapper[5200]: W1126 13:45:14.291813 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40053995_8fb5_4ef3_a6c1_06570eaa42cb.slice/crio-80c1e9b7542eda3717be2ca5b01415e4948ed57450ff025a7d71b68832cfa99e WatchSource:0}: Error finding container 80c1e9b7542eda3717be2ca5b01415e4948ed57450ff025a7d71b68832cfa99e: Status 404 returned error can't find the container with id 80c1e9b7542eda3717be2ca5b01415e4948ed57450ff025a7d71b68832cfa99e Nov 26 13:45:14 crc kubenswrapper[5200]: I1126 13:45:14.581063 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerStarted","Data":"a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c"} Nov 26 13:45:14 crc kubenswrapper[5200]: I1126 13:45:14.588124 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" event={"ID":"40053995-8fb5-4ef3-a6c1-06570eaa42cb","Type":"ContainerStarted","Data":"7ff2908b8ddc851946a0bee3cc543d49054a29e5e2495a03e58fa3f60943e7e2"} Nov 26 13:45:14 crc kubenswrapper[5200]: I1126 13:45:14.588225 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" event={"ID":"40053995-8fb5-4ef3-a6c1-06570eaa42cb","Type":"ContainerStarted","Data":"80c1e9b7542eda3717be2ca5b01415e4948ed57450ff025a7d71b68832cfa99e"} Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.190058 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.190148 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.260448 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.700917 5200 generic.go:358] "Generic (PLEG): container finished" podID="d87a82c4-8157-4640-b90a-42827c4232eb" containerID="a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c" exitCode=0 Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.701072 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerDied","Data":"a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c"} Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.708697 5200 generic.go:358] "Generic (PLEG): container finished" podID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerID="7ff2908b8ddc851946a0bee3cc543d49054a29e5e2495a03e58fa3f60943e7e2" exitCode=0 Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.710659 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" event={"ID":"40053995-8fb5-4ef3-a6c1-06570eaa42cb","Type":"ContainerDied","Data":"7ff2908b8ddc851946a0bee3cc543d49054a29e5e2495a03e58fa3f60943e7e2"} Nov 26 13:45:15 crc kubenswrapper[5200]: I1126 13:45:15.815150 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.490813 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-console/console-64d44f6ddf-vvkzj" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerName="console" containerID="cri-o://2e649cf23852b660db8e678b7652ebc64a43dad6999b8449fb5aad0703e2bbe5" gracePeriod=15 Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.719553 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d44f6ddf-vvkzj_5e4ba348-184c-436b-87ae-2fddc33c378e/console/0.log" Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.720070 5200 generic.go:358] "Generic (PLEG): container finished" podID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerID="2e649cf23852b660db8e678b7652ebc64a43dad6999b8449fb5aad0703e2bbe5" exitCode=2 Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.720196 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-vvkzj" event={"ID":"5e4ba348-184c-436b-87ae-2fddc33c378e","Type":"ContainerDied","Data":"2e649cf23852b660db8e678b7652ebc64a43dad6999b8449fb5aad0703e2bbe5"} Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.723987 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerStarted","Data":"98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846"} Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.748883 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kbfg4" podStartSLOduration=3.950767608 podStartE2EDuration="4.748857083s" podCreationTimestamp="2025-11-26 13:45:12 +0000 UTC" firstStartedPulling="2025-11-26 13:45:13.570871233 +0000 UTC m=+882.250073091" lastFinishedPulling="2025-11-26 
13:45:14.368960738 +0000 UTC m=+883.048162566" observedRunningTime="2025-11-26 13:45:16.745085745 +0000 UTC m=+885.424287563" watchObservedRunningTime="2025-11-26 13:45:16.748857083 +0000 UTC m=+885.428058911" Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.967459 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d44f6ddf-vvkzj_5e4ba348-184c-436b-87ae-2fddc33c378e/console/0.log" Nov 26 13:45:16 crc kubenswrapper[5200]: I1126 13:45:16.967563 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.028635 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-console-config\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.028753 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-serving-cert\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.028817 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh4f5\" (UniqueName: \"kubernetes.io/projected/5e4ba348-184c-436b-87ae-2fddc33c378e-kube-api-access-bh4f5\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.028863 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-service-ca\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.028899 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-oauth-config\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.028950 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-trusted-ca-bundle\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.029191 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-oauth-serving-cert\") pod \"5e4ba348-184c-436b-87ae-2fddc33c378e\" (UID: \"5e4ba348-184c-436b-87ae-2fddc33c378e\") " Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.032257 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-console-config" (OuterVolumeSpecName: "console-config") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "console-config". 
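The pod_startup_latency_tracker entries above carry two durations: podStartE2EDuration is the gap between podCreationTimestamp and the observed running time, and podStartSLOduration appears to be that same gap minus the image-pull window (firstStartedPulling to lastFinishedPulling); the redhat-marketplace-8rd67 numbers line up exactly that way. A short sketch redoing that arithmetic with the timestamps copied from the log (monotonic "m=+..." suffixes dropped):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the redhat-marketplace-8rd67 startup-latency entry above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-11-26 13:45:04 +0000 UTC")
	firstPull, _ := time.Parse(layout, "2025-11-26 13:45:06.489142033 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2025-11-26 13:45:07.381707435 +0000 UTC")
	running, _ := time.Parse(layout, "2025-11-26 13:45:09.565443497 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // E2E minus the image-pull window
	fmt.Println(e2e, slo)                // 5.565443497s 4.672878095s
}
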
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.032672 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.033512 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-service-ca" (OuterVolumeSpecName: "service-ca") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.034080 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.054009 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.056630 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4ba348-184c-436b-87ae-2fddc33c378e-kube-api-access-bh4f5" (OuterVolumeSpecName: "kube-api-access-bh4f5") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "kube-api-access-bh4f5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.069504 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5e4ba348-184c-436b-87ae-2fddc33c378e" (UID: "5e4ba348-184c-436b-87ae-2fddc33c378e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131205 5200 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-console-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131249 5200 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131263 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bh4f5\" (UniqueName: \"kubernetes.io/projected/5e4ba348-184c-436b-87ae-2fddc33c378e-kube-api-access-bh4f5\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131279 5200 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131294 5200 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e4ba348-184c-436b-87ae-2fddc33c378e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131304 5200 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.131316 5200 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e4ba348-184c-436b-87ae-2fddc33c378e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.737929 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d44f6ddf-vvkzj_5e4ba348-184c-436b-87ae-2fddc33c378e/console/0.log" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.738744 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d44f6ddf-vvkzj" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.738755 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-vvkzj" event={"ID":"5e4ba348-184c-436b-87ae-2fddc33c378e","Type":"ContainerDied","Data":"58e07c8d22d8dc4a31c93d564458f59a50385cb0bf30fd83b401a15774d5b434"} Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.738890 5200 scope.go:117] "RemoveContainer" containerID="2e649cf23852b660db8e678b7652ebc64a43dad6999b8449fb5aad0703e2bbe5" Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.752003 5200 generic.go:358] "Generic (PLEG): container finished" podID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerID="d924b023fd274a243ff67c77e03c4aeaf391ff8524d14e26a16ca3ac3a361457" exitCode=0 Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.752121 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" event={"ID":"40053995-8fb5-4ef3-a6c1-06570eaa42cb","Type":"ContainerDied","Data":"d924b023fd274a243ff67c77e03c4aeaf391ff8524d14e26a16ca3ac3a361457"} Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.854790 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d44f6ddf-vvkzj"] Nov 26 13:45:17 crc kubenswrapper[5200]: I1126 13:45:17.856456 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64d44f6ddf-vvkzj"] Nov 26 13:45:18 crc kubenswrapper[5200]: I1126 13:45:18.765897 5200 generic.go:358] "Generic (PLEG): container finished" podID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerID="448427f2d33a61fb20fb32a795f5062aedd01445986f3838d458cdaa22cc7242" exitCode=0 Nov 26 13:45:18 crc kubenswrapper[5200]: I1126 13:45:18.766022 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" event={"ID":"40053995-8fb5-4ef3-a6c1-06570eaa42cb","Type":"ContainerDied","Data":"448427f2d33a61fb20fb32a795f5062aedd01445986f3838d458cdaa22cc7242"} Nov 26 13:45:18 crc kubenswrapper[5200]: I1126 13:45:18.989341 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" path="/var/lib/kubelet/pods/5e4ba348-184c-436b-87ae-2fddc33c378e/volumes" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.201746 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rd67"] Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.202092 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8rd67" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="registry-server" containerID="cri-o://7066f91c7971bc48952a376a2c3f9fff82a0ff0d13c799584349861560f80cf5" gracePeriod=2 Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.203107 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.288839 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-bundle\") pod \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.289052 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28rbj\" (UniqueName: \"kubernetes.io/projected/40053995-8fb5-4ef3-a6c1-06570eaa42cb-kube-api-access-28rbj\") pod \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.289114 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-util\") pod \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\" (UID: \"40053995-8fb5-4ef3-a6c1-06570eaa42cb\") " Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.290039 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-bundle" (OuterVolumeSpecName: "bundle") pod "40053995-8fb5-4ef3-a6c1-06570eaa42cb" (UID: "40053995-8fb5-4ef3-a6c1-06570eaa42cb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.308973 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40053995-8fb5-4ef3-a6c1-06570eaa42cb-kube-api-access-28rbj" (OuterVolumeSpecName: "kube-api-access-28rbj") pod "40053995-8fb5-4ef3-a6c1-06570eaa42cb" (UID: "40053995-8fb5-4ef3-a6c1-06570eaa42cb"). InnerVolumeSpecName "kube-api-access-28rbj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.391221 5200 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.391689 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28rbj\" (UniqueName: \"kubernetes.io/projected/40053995-8fb5-4ef3-a6c1-06570eaa42cb-kube-api-access-28rbj\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.488823 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-util" (OuterVolumeSpecName: "util") pod "40053995-8fb5-4ef3-a6c1-06570eaa42cb" (UID: "40053995-8fb5-4ef3-a6c1-06570eaa42cb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.493956 5200 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40053995-8fb5-4ef3-a6c1-06570eaa42cb-util\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.791172 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" event={"ID":"40053995-8fb5-4ef3-a6c1-06570eaa42cb","Type":"ContainerDied","Data":"80c1e9b7542eda3717be2ca5b01415e4948ed57450ff025a7d71b68832cfa99e"} Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.791241 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80c1e9b7542eda3717be2ca5b01415e4948ed57450ff025a7d71b68832cfa99e" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.791386 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg" Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.796760 5200 generic.go:358] "Generic (PLEG): container finished" podID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerID="7066f91c7971bc48952a376a2c3f9fff82a0ff0d13c799584349861560f80cf5" exitCode=0 Nov 26 13:45:20 crc kubenswrapper[5200]: I1126 13:45:20.796856 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rd67" event={"ID":"aec2f55c-b101-4413-809e-ba41a4a16d96","Type":"ContainerDied","Data":"7066f91c7971bc48952a376a2c3f9fff82a0ff0d13c799584349861560f80cf5"} Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.199303 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.307139 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxkl\" (UniqueName: \"kubernetes.io/projected/aec2f55c-b101-4413-809e-ba41a4a16d96-kube-api-access-kmxkl\") pod \"aec2f55c-b101-4413-809e-ba41a4a16d96\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.307237 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-catalog-content\") pod \"aec2f55c-b101-4413-809e-ba41a4a16d96\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.307310 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-utilities\") pod \"aec2f55c-b101-4413-809e-ba41a4a16d96\" (UID: \"aec2f55c-b101-4413-809e-ba41a4a16d96\") " Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.309087 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-utilities" (OuterVolumeSpecName: "utilities") pod "aec2f55c-b101-4413-809e-ba41a4a16d96" (UID: "aec2f55c-b101-4413-809e-ba41a4a16d96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.317849 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec2f55c-b101-4413-809e-ba41a4a16d96-kube-api-access-kmxkl" (OuterVolumeSpecName: "kube-api-access-kmxkl") pod "aec2f55c-b101-4413-809e-ba41a4a16d96" (UID: "aec2f55c-b101-4413-809e-ba41a4a16d96"). InnerVolumeSpecName "kube-api-access-kmxkl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.317874 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aec2f55c-b101-4413-809e-ba41a4a16d96" (UID: "aec2f55c-b101-4413-809e-ba41a4a16d96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.409379 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmxkl\" (UniqueName: \"kubernetes.io/projected/aec2f55c-b101-4413-809e-ba41a4a16d96-kube-api-access-kmxkl\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.409423 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.409433 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aec2f55c-b101-4413-809e-ba41a4a16d96-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.809493 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rd67" event={"ID":"aec2f55c-b101-4413-809e-ba41a4a16d96","Type":"ContainerDied","Data":"2d78d8546200e5d6772469bf27e33ad52f8f686d53c11b8a2edb27f1ea6e7803"} Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.809601 5200 scope.go:117] "RemoveContainer" containerID="7066f91c7971bc48952a376a2c3f9fff82a0ff0d13c799584349861560f80cf5" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.809515 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rd67" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.859938 5200 scope.go:117] "RemoveContainer" containerID="7aa7a3792efb93c413a9b858d528d3d111ec217007c9b2e58e558e91115763f7" Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.860618 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rd67"] Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.872948 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rd67"] Nov 26 13:45:21 crc kubenswrapper[5200]: I1126 13:45:21.888994 5200 scope.go:117] "RemoveContainer" containerID="f51ea28c1a2e2ea2b10b042d1629d552c68752c0d8686e91353b3366d00bde10" Nov 26 13:45:22 crc kubenswrapper[5200]: I1126 13:45:22.774471 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:22 crc kubenswrapper[5200]: I1126 13:45:22.774564 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:22 crc kubenswrapper[5200]: I1126 13:45:22.880295 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:22 crc kubenswrapper[5200]: I1126 13:45:22.951288 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:22 crc kubenswrapper[5200]: I1126 13:45:22.976807 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" path="/var/lib/kubelet/pods/aec2f55c-b101-4413-809e-ba41a4a16d96/volumes" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.004717 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nsdf8"] Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005822 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerName="pull" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005835 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerName="pull" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005850 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="extract-utilities" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005855 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="extract-utilities" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005875 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerName="console" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005881 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerName="console" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005890 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerName="util" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005895 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" 
containerName="util" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005907 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="registry-server" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005913 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="registry-server" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005922 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="extract-content" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005928 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="extract-content" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005940 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerName="extract" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.005945 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerName="extract" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.006065 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e4ba348-184c-436b-87ae-2fddc33c378e" containerName="console" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.006091 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="aec2f55c-b101-4413-809e-ba41a4a16d96" containerName="registry-server" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.006101 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="40053995-8fb5-4ef3-a6c1-06570eaa42cb" containerName="extract" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.020484 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.020345 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nsdf8"] Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.151747 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-catalog-content\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.151831 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7sgl\" (UniqueName: \"kubernetes.io/projected/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-kube-api-access-t7sgl\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.151999 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-utilities\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.253897 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-catalog-content\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.253956 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7sgl\" (UniqueName: \"kubernetes.io/projected/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-kube-api-access-t7sgl\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.254169 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-utilities\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.254571 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-catalog-content\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.254663 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-utilities\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.282098 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t7sgl\" (UniqueName: \"kubernetes.io/projected/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-kube-api-access-t7sgl\") pod \"certified-operators-nsdf8\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.343733 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.639216 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nsdf8"] Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.847360 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerStarted","Data":"8ffea3044a00d69c87c399b5e8673f93dd00acfebff9c4e6ea4fdda4386e0c95"} Nov 26 13:45:24 crc kubenswrapper[5200]: I1126 13:45:24.847417 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerStarted","Data":"e0b7050741933a4231df9664ae32db1cdbcb74ba417ca9261362ea42ea018381"} Nov 26 13:45:25 crc kubenswrapper[5200]: I1126 13:45:25.875519 5200 generic.go:358] "Generic (PLEG): container finished" podID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerID="8ffea3044a00d69c87c399b5e8673f93dd00acfebff9c4e6ea4fdda4386e0c95" exitCode=0 Nov 26 13:45:25 crc kubenswrapper[5200]: I1126 13:45:25.876298 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerDied","Data":"8ffea3044a00d69c87c399b5e8673f93dd00acfebff9c4e6ea4fdda4386e0c95"} Nov 26 13:45:27 crc kubenswrapper[5200]: I1126 13:45:27.795934 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbfg4"] Nov 26 13:45:27 crc kubenswrapper[5200]: I1126 13:45:27.796966 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kbfg4" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="registry-server" containerID="cri-o://98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846" gracePeriod=2 Nov 26 13:45:27 crc kubenswrapper[5200]: I1126 13:45:27.891993 5200 generic.go:358] "Generic (PLEG): container finished" podID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerID="7d89797708476ecf5392ac20bfed3017defdd5623fcd19522bbcfa956c17e56d" exitCode=0 Nov 26 13:45:27 crc kubenswrapper[5200]: I1126 13:45:27.892085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerDied","Data":"7d89797708476ecf5392ac20bfed3017defdd5623fcd19522bbcfa956c17e56d"} Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.272961 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.432881 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-utilities\") pod \"d87a82c4-8157-4640-b90a-42827c4232eb\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.433002 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-catalog-content\") pod \"d87a82c4-8157-4640-b90a-42827c4232eb\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.433037 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdzwz\" (UniqueName: \"kubernetes.io/projected/d87a82c4-8157-4640-b90a-42827c4232eb-kube-api-access-sdzwz\") pod \"d87a82c4-8157-4640-b90a-42827c4232eb\" (UID: \"d87a82c4-8157-4640-b90a-42827c4232eb\") " Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.433991 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-utilities" (OuterVolumeSpecName: "utilities") pod "d87a82c4-8157-4640-b90a-42827c4232eb" (UID: "d87a82c4-8157-4640-b90a-42827c4232eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.444983 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d87a82c4-8157-4640-b90a-42827c4232eb-kube-api-access-sdzwz" (OuterVolumeSpecName: "kube-api-access-sdzwz") pod "d87a82c4-8157-4640-b90a-42827c4232eb" (UID: "d87a82c4-8157-4640-b90a-42827c4232eb"). InnerVolumeSpecName "kube-api-access-sdzwz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.468220 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r"] Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469047 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="extract-content" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469129 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="extract-content" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469187 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="registry-server" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469240 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="registry-server" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469304 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="extract-utilities" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469356 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="extract-utilities" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.469528 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" containerName="registry-server" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.489617 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d87a82c4-8157-4640-b90a-42827c4232eb" (UID: "d87a82c4-8157-4640-b90a-42827c4232eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.534220 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.534262 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdzwz\" (UniqueName: \"kubernetes.io/projected/d87a82c4-8157-4640-b90a-42827c4232eb-kube-api-access-sdzwz\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.534275 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d87a82c4-8157-4640-b90a-42827c4232eb-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.579469 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r"] Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.579654 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.583865 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"metallb-system\"/\"openshift-service-ca.crt\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.584038 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"manager-account-dockercfg-nrs64\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.584165 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"metallb-system\"/\"kube-root-ca.crt\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.585827 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-operator-webhook-server-cert\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.587562 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-operator-controller-manager-service-cert\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.736089 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddfe9197-d598-4037-953f-b9b764821e16-webhook-cert\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.736393 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddfe9197-d598-4037-953f-b9b764821e16-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.736496 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nghw\" (UniqueName: \"kubernetes.io/projected/ddfe9197-d598-4037-953f-b9b764821e16-kube-api-access-2nghw\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.829418 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx"] Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.838114 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.838304 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddfe9197-d598-4037-953f-b9b764821e16-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.838367 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nghw\" (UniqueName: \"kubernetes.io/projected/ddfe9197-d598-4037-953f-b9b764821e16-kube-api-access-2nghw\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.838454 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddfe9197-d598-4037-953f-b9b764821e16-webhook-cert\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.842113 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"controller-dockercfg-qtwxx\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.842497 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.842754 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-webhook-cert\"" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.850012 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddfe9197-d598-4037-953f-b9b764821e16-webhook-cert\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.850048 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddfe9197-d598-4037-953f-b9b764821e16-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.856326 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx"] Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.877651 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nghw\" (UniqueName: \"kubernetes.io/projected/ddfe9197-d598-4037-953f-b9b764821e16-kube-api-access-2nghw\") pod \"metallb-operator-controller-manager-5cd4469944-6sw2r\" (UID: \"ddfe9197-d598-4037-953f-b9b764821e16\") " pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: 
I1126 13:45:28.895760 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.904896 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerStarted","Data":"4a8210e6ebf5b0ecd6c35bc57261c799af0195413a454080073a66a9275384ac"} Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.908655 5200 generic.go:358] "Generic (PLEG): container finished" podID="d87a82c4-8157-4640-b90a-42827c4232eb" containerID="98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846" exitCode=0 Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.908804 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbfg4" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.908903 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerDied","Data":"98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846"} Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.908956 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbfg4" event={"ID":"d87a82c4-8157-4640-b90a-42827c4232eb","Type":"ContainerDied","Data":"6cd9be5db3b26d2d11ebe20a0571510d511fc17f0e545a9a3784e5c210a81aa1"} Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.908992 5200 scope.go:117] "RemoveContainer" containerID="98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.934077 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nsdf8" podStartSLOduration=4.99889057 podStartE2EDuration="5.934060184s" podCreationTimestamp="2025-11-26 13:45:23 +0000 UTC" firstStartedPulling="2025-11-26 13:45:25.878040731 +0000 UTC m=+894.557242539" lastFinishedPulling="2025-11-26 13:45:26.813210325 +0000 UTC m=+895.492412153" observedRunningTime="2025-11-26 13:45:28.929956893 +0000 UTC m=+897.609158721" watchObservedRunningTime="2025-11-26 13:45:28.934060184 +0000 UTC m=+897.613261992" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.939890 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9858b6a-a71b-465f-8c42-0cc7fd754c24-apiservice-cert\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.940102 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjx7q\" (UniqueName: \"kubernetes.io/projected/d9858b6a-a71b-465f-8c42-0cc7fd754c24-kube-api-access-gjx7q\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:28 crc kubenswrapper[5200]: I1126 13:45:28.940218 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/d9858b6a-a71b-465f-8c42-0cc7fd754c24-webhook-cert\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.003478 5200 scope.go:117] "RemoveContainer" containerID="a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.009534 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbfg4"] Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.009570 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kbfg4"] Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.042093 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjx7q\" (UniqueName: \"kubernetes.io/projected/d9858b6a-a71b-465f-8c42-0cc7fd754c24-kube-api-access-gjx7q\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.042761 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9858b6a-a71b-465f-8c42-0cc7fd754c24-webhook-cert\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.042808 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9858b6a-a71b-465f-8c42-0cc7fd754c24-apiservice-cert\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.051526 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9858b6a-a71b-465f-8c42-0cc7fd754c24-webhook-cert\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.052531 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9858b6a-a71b-465f-8c42-0cc7fd754c24-apiservice-cert\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.064020 5200 scope.go:117] "RemoveContainer" containerID="093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.065763 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjx7q\" (UniqueName: \"kubernetes.io/projected/d9858b6a-a71b-465f-8c42-0cc7fd754c24-kube-api-access-gjx7q\") pod \"metallb-operator-webhook-server-658b6bb794-ghdtx\" (UID: \"d9858b6a-a71b-465f-8c42-0cc7fd754c24\") " pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 
26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.120919 5200 scope.go:117] "RemoveContainer" containerID="98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846" Nov 26 13:45:29 crc kubenswrapper[5200]: E1126 13:45:29.123586 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846\": container with ID starting with 98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846 not found: ID does not exist" containerID="98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.123643 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846"} err="failed to get container status \"98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846\": rpc error: code = NotFound desc = could not find container \"98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846\": container with ID starting with 98ecc323c8a9d7cac09e3cfc6692aabf734416aea384996c6c981247badf3846 not found: ID does not exist" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.123668 5200 scope.go:117] "RemoveContainer" containerID="a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c" Nov 26 13:45:29 crc kubenswrapper[5200]: E1126 13:45:29.124648 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c\": container with ID starting with a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c not found: ID does not exist" containerID="a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.124673 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c"} err="failed to get container status \"a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c\": rpc error: code = NotFound desc = could not find container \"a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c\": container with ID starting with a7e59537e57b82f685cee170c5dd49dd445ccc99739231176b6a17391f31cc0c not found: ID does not exist" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.124716 5200 scope.go:117] "RemoveContainer" containerID="093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc" Nov 26 13:45:29 crc kubenswrapper[5200]: E1126 13:45:29.124934 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc\": container with ID starting with 093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc not found: ID does not exist" containerID="093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.124959 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc"} err="failed to get container status \"093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc\": rpc error: code = NotFound desc = could not find container 
\"093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc\": container with ID starting with 093262776bfa314d57b55210104531cfb9d2ffd4e587b52cf4e2c63fc7222cbc not found: ID does not exist" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.173636 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.214850 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r"] Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.485421 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx"] Nov 26 13:45:29 crc kubenswrapper[5200]: W1126 13:45:29.488874 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9858b6a_a71b_465f_8c42_0cc7fd754c24.slice/crio-b78fe29403e119cabd45677b0bbdfce461106e7446d80f61b45433b9347c9875 WatchSource:0}: Error finding container b78fe29403e119cabd45677b0bbdfce461106e7446d80f61b45433b9347c9875: Status 404 returned error can't find the container with id b78fe29403e119cabd45677b0bbdfce461106e7446d80f61b45433b9347c9875 Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.917622 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" event={"ID":"ddfe9197-d598-4037-953f-b9b764821e16","Type":"ContainerStarted","Data":"1aace08d9073b7fdf4d8da7ec4f8b4c7c8f99f0f6c632323beb4beb702d34cde"} Nov 26 13:45:29 crc kubenswrapper[5200]: I1126 13:45:29.920885 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" event={"ID":"d9858b6a-a71b-465f-8c42-0cc7fd754c24","Type":"ContainerStarted","Data":"b78fe29403e119cabd45677b0bbdfce461106e7446d80f61b45433b9347c9875"} Nov 26 13:45:30 crc kubenswrapper[5200]: I1126 13:45:30.977238 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d87a82c4-8157-4640-b90a-42827c4232eb" path="/var/lib/kubelet/pods/d87a82c4-8157-4640-b90a-42827c4232eb/volumes" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.345559 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.346480 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.392937 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.400253 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.403339 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.404737 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:45:34 crc 
kubenswrapper[5200]: I1126 13:45:34.416580 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.980227 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:45:34 crc kubenswrapper[5200]: I1126 13:45:34.980748 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" event={"ID":"ddfe9197-d598-4037-953f-b9b764821e16","Type":"ContainerStarted","Data":"6981602f321fb6afe294adab06f330b72298547d782d31286187e03de9b20e52"} Nov 26 13:45:35 crc kubenswrapper[5200]: I1126 13:45:35.007850 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" podStartSLOduration=1.837030345 podStartE2EDuration="7.007828202s" podCreationTimestamp="2025-11-26 13:45:28 +0000 UTC" firstStartedPulling="2025-11-26 13:45:29.199647832 +0000 UTC m=+897.878849650" lastFinishedPulling="2025-11-26 13:45:34.370445689 +0000 UTC m=+903.049647507" observedRunningTime="2025-11-26 13:45:35.002425846 +0000 UTC m=+903.681627664" watchObservedRunningTime="2025-11-26 13:45:35.007828202 +0000 UTC m=+903.687030030" Nov 26 13:45:35 crc kubenswrapper[5200]: I1126 13:45:35.033806 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:37 crc kubenswrapper[5200]: I1126 13:45:37.791449 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nsdf8"] Nov 26 13:45:37 crc kubenswrapper[5200]: I1126 13:45:37.792533 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nsdf8" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="registry-server" containerID="cri-o://4a8210e6ebf5b0ecd6c35bc57261c799af0195413a454080073a66a9275384ac" gracePeriod=2 Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.023610 5200 generic.go:358] "Generic (PLEG): container finished" podID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerID="4a8210e6ebf5b0ecd6c35bc57261c799af0195413a454080073a66a9275384ac" exitCode=0 Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.023944 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerDied","Data":"4a8210e6ebf5b0ecd6c35bc57261c799af0195413a454080073a66a9275384ac"} Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.025974 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" event={"ID":"d9858b6a-a71b-465f-8c42-0cc7fd754c24","Type":"ContainerStarted","Data":"9eb51f4149f9358d24f71075f775248cdbc6f94ebbd71d56a17d6366b412d14b"} Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.027167 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.058388 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" podStartSLOduration=2.591063522 
podStartE2EDuration="10.058364213s" podCreationTimestamp="2025-11-26 13:45:28 +0000 UTC" firstStartedPulling="2025-11-26 13:45:29.492819948 +0000 UTC m=+898.172021766" lastFinishedPulling="2025-11-26 13:45:36.960120639 +0000 UTC m=+905.639322457" observedRunningTime="2025-11-26 13:45:38.047257523 +0000 UTC m=+906.726459351" watchObservedRunningTime="2025-11-26 13:45:38.058364213 +0000 UTC m=+906.737566051" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.165732 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.223968 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7sgl\" (UniqueName: \"kubernetes.io/projected/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-kube-api-access-t7sgl\") pod \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.224089 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-catalog-content\") pod \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.224162 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-utilities\") pod \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\" (UID: \"bc03584d-4ca5-4357-b83f-f6bd99ba99ee\") " Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.225944 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-utilities" (OuterVolumeSpecName: "utilities") pod "bc03584d-4ca5-4357-b83f-f6bd99ba99ee" (UID: "bc03584d-4ca5-4357-b83f-f6bd99ba99ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.235106 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-kube-api-access-t7sgl" (OuterVolumeSpecName: "kube-api-access-t7sgl") pod "bc03584d-4ca5-4357-b83f-f6bd99ba99ee" (UID: "bc03584d-4ca5-4357-b83f-f6bd99ba99ee"). InnerVolumeSpecName "kube-api-access-t7sgl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.253044 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc03584d-4ca5-4357-b83f-f6bd99ba99ee" (UID: "bc03584d-4ca5-4357-b83f-f6bd99ba99ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.325807 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.325878 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:38 crc kubenswrapper[5200]: I1126 13:45:38.325905 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t7sgl\" (UniqueName: \"kubernetes.io/projected/bc03584d-4ca5-4357-b83f-f6bd99ba99ee-kube-api-access-t7sgl\") on node \"crc\" DevicePath \"\"" Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.036916 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsdf8" event={"ID":"bc03584d-4ca5-4357-b83f-f6bd99ba99ee","Type":"ContainerDied","Data":"e0b7050741933a4231df9664ae32db1cdbcb74ba417ca9261362ea42ea018381"} Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.037471 5200 scope.go:117] "RemoveContainer" containerID="4a8210e6ebf5b0ecd6c35bc57261c799af0195413a454080073a66a9275384ac" Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.036941 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsdf8" Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.072436 5200 scope.go:117] "RemoveContainer" containerID="7d89797708476ecf5392ac20bfed3017defdd5623fcd19522bbcfa956c17e56d" Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.074638 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nsdf8"] Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.082980 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nsdf8"] Nov 26 13:45:39 crc kubenswrapper[5200]: I1126 13:45:39.093591 5200 scope.go:117] "RemoveContainer" containerID="8ffea3044a00d69c87c399b5e8673f93dd00acfebff9c4e6ea4fdda4386e0c95" Nov 26 13:45:40 crc kubenswrapper[5200]: I1126 13:45:40.982333 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" path="/var/lib/kubelet/pods/bc03584d-4ca5-4357-b83f-f6bd99ba99ee/volumes" Nov 26 13:45:50 crc kubenswrapper[5200]: I1126 13:45:50.050460 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-658b6bb794-ghdtx" Nov 26 13:46:05 crc kubenswrapper[5200]: I1126 13:46:05.989624 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5cd4469944-6sw2r" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.190149 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gzgvm"] Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.190835 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="extract-content" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.190853 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="extract-content" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 
13:46:07.190869 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="registry-server" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.190877 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="registry-server" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.190900 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="extract-utilities" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.190909 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="extract-utilities" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.191014 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc03584d-4ca5-4357-b83f-f6bd99ba99ee" containerName="registry-server" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.196888 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.199506 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"speaker-certs-secret\"" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.200064 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-memberlist\"" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.200668 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"speaker-dockercfg-xgnx5\"" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.200736 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"metallb-system\"/\"metallb-excludel2\"" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.244234 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7964959ff7-8q6lh"] Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.260047 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.279483 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"controller-certs-secret\"" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.287171 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7964959ff7-8q6lh"] Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.288008 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.288068 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nw4g\" (UniqueName: \"kubernetes.io/projected/4caee2ec-ed09-47e7-859f-5f30447cd56f-kube-api-access-2nw4g\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.288123 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4caee2ec-ed09-47e7-859f-5f30447cd56f-metallb-excludel2\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.288162 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-metrics-certs\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.389952 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-cert\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.390039 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-metrics-certs\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.390081 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.390190 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nw4g\" (UniqueName: \"kubernetes.io/projected/4caee2ec-ed09-47e7-859f-5f30447cd56f-kube-api-access-2nw4g\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.390305 
5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvdt\" (UniqueName: \"kubernetes.io/projected/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-kube-api-access-qrvdt\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: E1126 13:46:07.390201 5200 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.390474 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4caee2ec-ed09-47e7-859f-5f30447cd56f-metallb-excludel2\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: E1126 13:46:07.390524 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist podName:4caee2ec-ed09-47e7-859f-5f30447cd56f nodeName:}" failed. No retries permitted until 2025-11-26 13:46:07.890469796 +0000 UTC m=+936.569671614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist") pod "speaker-gzgvm" (UID: "4caee2ec-ed09-47e7-859f-5f30447cd56f") : secret "metallb-memberlist" not found Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.390719 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-metrics-certs\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: E1126 13:46:07.390835 5200 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 26 13:46:07 crc kubenswrapper[5200]: E1126 13:46:07.390946 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-metrics-certs podName:4caee2ec-ed09-47e7-859f-5f30447cd56f nodeName:}" failed. No retries permitted until 2025-11-26 13:46:07.890925299 +0000 UTC m=+936.570127117 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-metrics-certs") pod "speaker-gzgvm" (UID: "4caee2ec-ed09-47e7-859f-5f30447cd56f") : secret "speaker-certs-secret" not found Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.391360 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4caee2ec-ed09-47e7-859f-5f30447cd56f-metallb-excludel2\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.414447 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nw4g\" (UniqueName: \"kubernetes.io/projected/4caee2ec-ed09-47e7-859f-5f30447cd56f-kube-api-access-2nw4g\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.492089 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvdt\" (UniqueName: \"kubernetes.io/projected/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-kube-api-access-qrvdt\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.492269 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-cert\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.492306 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-metrics-certs\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.495331 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-webhook-cert\"" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.500327 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-metrics-certs\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.507332 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-cert\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.509586 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvdt\" (UniqueName: \"kubernetes.io/projected/337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb-kube-api-access-qrvdt\") pod \"controller-7964959ff7-8q6lh\" (UID: \"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb\") " pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.588851 5200 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.905254 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-metrics-certs\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.905910 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:07 crc kubenswrapper[5200]: E1126 13:46:07.906104 5200 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 13:46:07 crc kubenswrapper[5200]: E1126 13:46:07.906257 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist podName:4caee2ec-ed09-47e7-859f-5f30447cd56f nodeName:}" failed. No retries permitted until 2025-11-26 13:46:08.906226848 +0000 UTC m=+937.585428666 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist") pod "speaker-gzgvm" (UID: "4caee2ec-ed09-47e7-859f-5f30447cd56f") : secret "metallb-memberlist" not found Nov 26 13:46:07 crc kubenswrapper[5200]: I1126 13:46:07.913192 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-metrics-certs\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:08 crc kubenswrapper[5200]: I1126 13:46:08.126927 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7964959ff7-8q6lh"] Nov 26 13:46:08 crc kubenswrapper[5200]: W1126 13:46:08.135215 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod337bdc2c_c8e3_4584_9f4e_bcd5c8c4ccfb.slice/crio-efd1d15ac46f5858fbf96af7bff5081a3ede54ff114824b1a0a8ec25099ad12f WatchSource:0}: Error finding container efd1d15ac46f5858fbf96af7bff5081a3ede54ff114824b1a0a8ec25099ad12f: Status 404 returned error can't find the container with id efd1d15ac46f5858fbf96af7bff5081a3ede54ff114824b1a0a8ec25099ad12f Nov 26 13:46:08 crc kubenswrapper[5200]: I1126 13:46:08.137847 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 13:46:08 crc kubenswrapper[5200]: I1126 13:46:08.288732 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7964959ff7-8q6lh" event={"ID":"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb","Type":"ContainerStarted","Data":"efd1d15ac46f5858fbf96af7bff5081a3ede54ff114824b1a0a8ec25099ad12f"} Nov 26 13:46:08 crc kubenswrapper[5200]: I1126 13:46:08.922801 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:08 crc kubenswrapper[5200]: I1126 
13:46:08.930613 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4caee2ec-ed09-47e7-859f-5f30447cd56f-memberlist\") pod \"speaker-gzgvm\" (UID: \"4caee2ec-ed09-47e7-859f-5f30447cd56f\") " pod="metallb-system/speaker-gzgvm" Nov 26 13:46:09 crc kubenswrapper[5200]: I1126 13:46:09.014443 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gzgvm" Nov 26 13:46:09 crc kubenswrapper[5200]: I1126 13:46:09.300552 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7964959ff7-8q6lh" event={"ID":"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb","Type":"ContainerStarted","Data":"f07feb17f4abc08a70d672ef43b10b9c4bdf6779a8f03dd4097f4358ef057ec4"} Nov 26 13:46:09 crc kubenswrapper[5200]: I1126 13:46:09.301018 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:09 crc kubenswrapper[5200]: I1126 13:46:09.301038 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7964959ff7-8q6lh" event={"ID":"337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb","Type":"ContainerStarted","Data":"16f4c0c9146704ba6bd08b9699befe8f6d661f643bac7354f7fd1be4c44d8474"} Nov 26 13:46:09 crc kubenswrapper[5200]: I1126 13:46:09.303391 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gzgvm" event={"ID":"4caee2ec-ed09-47e7-859f-5f30447cd56f","Type":"ContainerStarted","Data":"423b4cce7804710ef6f7d48f707c5c8d1c93a27f2047f0acc412cb2f3ee05430"} Nov 26 13:46:09 crc kubenswrapper[5200]: I1126 13:46:09.323310 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7964959ff7-8q6lh" podStartSLOduration=2.323285056 podStartE2EDuration="2.323285056s" podCreationTimestamp="2025-11-26 13:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:46:09.319211706 +0000 UTC m=+937.998413554" watchObservedRunningTime="2025-11-26 13:46:09.323285056 +0000 UTC m=+938.002486874" Nov 26 13:46:10 crc kubenswrapper[5200]: I1126 13:46:10.312120 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gzgvm" event={"ID":"4caee2ec-ed09-47e7-859f-5f30447cd56f","Type":"ContainerStarted","Data":"07df5f59522af20fcd0394c5101b5d237620126aed3dca93e4b5459bdca29163"} Nov 26 13:46:10 crc kubenswrapper[5200]: I1126 13:46:10.313882 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gzgvm" event={"ID":"4caee2ec-ed09-47e7-859f-5f30447cd56f","Type":"ContainerStarted","Data":"2a385be0f5de538538848b895ed772cf41b6f34a4ffcf3567042e9ba3ee03035"} Nov 26 13:46:10 crc kubenswrapper[5200]: I1126 13:46:10.333619 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gzgvm" podStartSLOduration=3.333598116 podStartE2EDuration="3.333598116s" podCreationTimestamp="2025-11-26 13:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:46:10.328963771 +0000 UTC m=+939.008165619" watchObservedRunningTime="2025-11-26 13:46:10.333598116 +0000 UTC m=+939.012799934" Nov 26 13:46:11 crc kubenswrapper[5200]: I1126 13:46:11.319642 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/speaker-gzgvm" Nov 26 13:46:20 crc 
kubenswrapper[5200]: I1126 13:46:20.320339 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7964959ff7-8q6lh" Nov 26 13:46:22 crc kubenswrapper[5200]: I1126 13:46:22.329940 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gzgvm" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.022728 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8"] Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.039395 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8"] Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.039553 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.042979 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.116704 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cl5d\" (UniqueName: \"kubernetes.io/projected/32c2ca85-2020-4991-8521-ed749516482c-kube-api-access-5cl5d\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.116772 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.116830 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.218981 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cl5d\" (UniqueName: \"kubernetes.io/projected/32c2ca85-2020-4991-8521-ed749516482c-kube-api-access-5cl5d\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.219048 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.219107 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.219849 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.219907 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.249036 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cl5d\" (UniqueName: \"kubernetes.io/projected/32c2ca85-2020-4991-8521-ed749516482c-kube-api-access-5cl5d\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.404134 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:24 crc kubenswrapper[5200]: I1126 13:46:24.941589 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8"] Nov 26 13:46:24 crc kubenswrapper[5200]: W1126 13:46:24.951824 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c2ca85_2020_4991_8521_ed749516482c.slice/crio-94bc2b61c79f9cf178d14cc2dbb5fbc205356443600e868e09dafc24cce78051 WatchSource:0}: Error finding container 94bc2b61c79f9cf178d14cc2dbb5fbc205356443600e868e09dafc24cce78051: Status 404 returned error can't find the container with id 94bc2b61c79f9cf178d14cc2dbb5fbc205356443600e868e09dafc24cce78051 Nov 26 13:46:25 crc kubenswrapper[5200]: I1126 13:46:25.454623 5200 generic.go:358] "Generic (PLEG): container finished" podID="32c2ca85-2020-4991-8521-ed749516482c" containerID="328e16a03a3726c3a6283879cdf5f150b057a0b67ce5ef8e8fab549310ba76a6" exitCode=0 Nov 26 13:46:25 crc kubenswrapper[5200]: I1126 13:46:25.455300 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" event={"ID":"32c2ca85-2020-4991-8521-ed749516482c","Type":"ContainerDied","Data":"328e16a03a3726c3a6283879cdf5f150b057a0b67ce5ef8e8fab549310ba76a6"} Nov 26 13:46:25 crc kubenswrapper[5200]: I1126 13:46:25.455358 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" event={"ID":"32c2ca85-2020-4991-8521-ed749516482c","Type":"ContainerStarted","Data":"94bc2b61c79f9cf178d14cc2dbb5fbc205356443600e868e09dafc24cce78051"} Nov 26 13:46:29 crc kubenswrapper[5200]: I1126 13:46:29.504349 5200 generic.go:358] "Generic (PLEG): container finished" podID="32c2ca85-2020-4991-8521-ed749516482c" containerID="1ac310be0110c08c9308e545d53f743e47749750579d05357c0cccb3ca64d840" exitCode=0 Nov 26 13:46:29 crc kubenswrapper[5200]: I1126 13:46:29.504534 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" event={"ID":"32c2ca85-2020-4991-8521-ed749516482c","Type":"ContainerDied","Data":"1ac310be0110c08c9308e545d53f743e47749750579d05357c0cccb3ca64d840"} Nov 26 13:46:30 crc kubenswrapper[5200]: I1126 13:46:30.516233 5200 generic.go:358] "Generic (PLEG): container finished" podID="32c2ca85-2020-4991-8521-ed749516482c" containerID="e5eaa0e0579486a08d29f2223ad2e36eaeae37faa172bdcd2b7cb5fa291d1cb5" exitCode=0 Nov 26 13:46:30 crc kubenswrapper[5200]: I1126 13:46:30.516369 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" event={"ID":"32c2ca85-2020-4991-8521-ed749516482c","Type":"ContainerDied","Data":"e5eaa0e0579486a08d29f2223ad2e36eaeae37faa172bdcd2b7cb5fa291d1cb5"} Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.929056 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.964161 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cl5d\" (UniqueName: \"kubernetes.io/projected/32c2ca85-2020-4991-8521-ed749516482c-kube-api-access-5cl5d\") pod \"32c2ca85-2020-4991-8521-ed749516482c\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.964257 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-bundle\") pod \"32c2ca85-2020-4991-8521-ed749516482c\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.964438 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-util\") pod \"32c2ca85-2020-4991-8521-ed749516482c\" (UID: \"32c2ca85-2020-4991-8521-ed749516482c\") " Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.966768 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-bundle" (OuterVolumeSpecName: "bundle") pod "32c2ca85-2020-4991-8521-ed749516482c" (UID: "32c2ca85-2020-4991-8521-ed749516482c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.976976 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c2ca85-2020-4991-8521-ed749516482c-kube-api-access-5cl5d" (OuterVolumeSpecName: "kube-api-access-5cl5d") pod "32c2ca85-2020-4991-8521-ed749516482c" (UID: "32c2ca85-2020-4991-8521-ed749516482c"). InnerVolumeSpecName "kube-api-access-5cl5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:46:31 crc kubenswrapper[5200]: I1126 13:46:31.987841 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-util" (OuterVolumeSpecName: "util") pod "32c2ca85-2020-4991-8521-ed749516482c" (UID: "32c2ca85-2020-4991-8521-ed749516482c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:46:32 crc kubenswrapper[5200]: I1126 13:46:32.066591 5200 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-util\") on node \"crc\" DevicePath \"\"" Nov 26 13:46:32 crc kubenswrapper[5200]: I1126 13:46:32.067117 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cl5d\" (UniqueName: \"kubernetes.io/projected/32c2ca85-2020-4991-8521-ed749516482c-kube-api-access-5cl5d\") on node \"crc\" DevicePath \"\"" Nov 26 13:46:32 crc kubenswrapper[5200]: I1126 13:46:32.067132 5200 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c2ca85-2020-4991-8521-ed749516482c-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:46:32 crc kubenswrapper[5200]: I1126 13:46:32.543210 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" Nov 26 13:46:32 crc kubenswrapper[5200]: I1126 13:46:32.543260 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8" event={"ID":"32c2ca85-2020-4991-8521-ed749516482c","Type":"ContainerDied","Data":"94bc2b61c79f9cf178d14cc2dbb5fbc205356443600e868e09dafc24cce78051"} Nov 26 13:46:32 crc kubenswrapper[5200]: I1126 13:46:32.543347 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94bc2b61c79f9cf178d14cc2dbb5fbc205356443600e868e09dafc24cce78051" Nov 26 13:46:33 crc kubenswrapper[5200]: I1126 13:46:33.155388 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:46:33 crc kubenswrapper[5200]: I1126 13:46:33.158820 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.312901 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz"] Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314036 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="util" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314054 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="util" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314081 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="extract" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314088 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="extract" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314109 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="pull" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314115 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="pull" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.314238 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="32c2ca85-2020-4991-8521-ed749516482c" containerName="extract" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.325178 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.327822 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-dnvmv\"" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.327989 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.328534 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.332187 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz"] Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.442130 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/081bb9fb-932b-4364-a4aa-a18b775a282d-tmp\") pod \"cert-manager-operator-controller-manager-655ccd8ccb-d4rvz\" (UID: \"081bb9fb-932b-4364-a4aa-a18b775a282d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.442621 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvxd\" (UniqueName: \"kubernetes.io/projected/081bb9fb-932b-4364-a4aa-a18b775a282d-kube-api-access-frvxd\") pod \"cert-manager-operator-controller-manager-655ccd8ccb-d4rvz\" (UID: \"081bb9fb-932b-4364-a4aa-a18b775a282d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.544302 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frvxd\" (UniqueName: \"kubernetes.io/projected/081bb9fb-932b-4364-a4aa-a18b775a282d-kube-api-access-frvxd\") pod \"cert-manager-operator-controller-manager-655ccd8ccb-d4rvz\" (UID: \"081bb9fb-932b-4364-a4aa-a18b775a282d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.544391 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/081bb9fb-932b-4364-a4aa-a18b775a282d-tmp\") pod \"cert-manager-operator-controller-manager-655ccd8ccb-d4rvz\" (UID: \"081bb9fb-932b-4364-a4aa-a18b775a282d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.545145 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/081bb9fb-932b-4364-a4aa-a18b775a282d-tmp\") pod \"cert-manager-operator-controller-manager-655ccd8ccb-d4rvz\" (UID: \"081bb9fb-932b-4364-a4aa-a18b775a282d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.578345 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvxd\" (UniqueName: \"kubernetes.io/projected/081bb9fb-932b-4364-a4aa-a18b775a282d-kube-api-access-frvxd\") pod \"cert-manager-operator-controller-manager-655ccd8ccb-d4rvz\" 
(UID: \"081bb9fb-932b-4364-a4aa-a18b775a282d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:36 crc kubenswrapper[5200]: I1126 13:46:36.646219 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" Nov 26 13:46:37 crc kubenswrapper[5200]: I1126 13:46:37.154195 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz"] Nov 26 13:46:37 crc kubenswrapper[5200]: W1126 13:46:37.161928 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod081bb9fb_932b_4364_a4aa_a18b775a282d.slice/crio-20335c5992c27ce5def8fc6f14a7dbe909b4d20ae2aa1a732e68c554033d3ac2 WatchSource:0}: Error finding container 20335c5992c27ce5def8fc6f14a7dbe909b4d20ae2aa1a732e68c554033d3ac2: Status 404 returned error can't find the container with id 20335c5992c27ce5def8fc6f14a7dbe909b4d20ae2aa1a732e68c554033d3ac2 Nov 26 13:46:37 crc kubenswrapper[5200]: I1126 13:46:37.587151 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" event={"ID":"081bb9fb-932b-4364-a4aa-a18b775a282d","Type":"ContainerStarted","Data":"20335c5992c27ce5def8fc6f14a7dbe909b4d20ae2aa1a732e68c554033d3ac2"} Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.076040 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-frr-k8s/frr-k8s-whspq"] Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.097245 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.100163 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"frr-startup\"" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.100519 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"kube-root-ca.crt\"" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.100632 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-frr-k8s\"/\"frr-k8s-certs-secret\"" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.100781 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-frr-k8s\"/\"frr-k8s-daemon-dockercfg-84tvn\"" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.100821 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"openshift-service-ca.crt\"" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.100927 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"env-overrides\"" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220511 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-conf\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220566 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-startup\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220593 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-sockets\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220642 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics-certs\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220698 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-status\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-status\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220727 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220759 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-reloader\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.220797 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sp5j\" (UniqueName: \"kubernetes.io/projected/8a76015f-70f1-4d45-ac2c-1d7158278ccc-kube-api-access-5sp5j\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.321903 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sp5j\" (UniqueName: \"kubernetes.io/projected/8a76015f-70f1-4d45-ac2c-1d7158278ccc-kube-api-access-5sp5j\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.322464 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-conf\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.322486 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-startup\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " 
pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.322957 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-conf\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.323010 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-sockets\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.323899 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-startup\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.323964 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics-certs\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: E1126 13:46:41.324630 5200 secret.go:189] Couldn't get secret openshift-frr-k8s/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 26 13:46:41 crc kubenswrapper[5200]: E1126 13:46:41.324712 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics-certs podName:8a76015f-70f1-4d45-ac2c-1d7158278ccc nodeName:}" failed. No retries permitted until 2025-11-26 13:46:41.824692395 +0000 UTC m=+970.503894213 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics-certs") pod "frr-k8s-whspq" (UID: "8a76015f-70f1-4d45-ac2c-1d7158278ccc") : secret "frr-k8s-certs-secret" not found Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.324910 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-status\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-status\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.324938 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.324974 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-reloader\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.325195 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-reloader\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.325393 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-status\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-status\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.325563 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.327585 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a76015f-70f1-4d45-ac2c-1d7158278ccc-frr-sockets\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.344863 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sp5j\" (UniqueName: \"kubernetes.io/projected/8a76015f-70f1-4d45-ac2c-1d7158278ccc-kube-api-access-5sp5j\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.834499 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics-certs\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: I1126 13:46:41.838952 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a76015f-70f1-4d45-ac2c-1d7158278ccc-metrics-certs\") pod \"frr-k8s-whspq\" (UID: \"8a76015f-70f1-4d45-ac2c-1d7158278ccc\") " pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:41 crc kubenswrapper[5200]: E1126 13:46:41.943055 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=740458 actualBytes=10240 Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.015634 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.445901 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9"] Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.460754 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.463023 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-frr-k8s\"/\"frr-k8s-webhook-server-cert\"" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.548361 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw47n\" (UniqueName: \"kubernetes.io/projected/2c74b238-fb7a-440c-a7e0-349b21612bbb-kube-api-access-fw47n\") pod \"frr-k8s-webhook-server-bc5694f79-pmxf9\" (UID: \"2c74b238-fb7a-440c-a7e0-349b21612bbb\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.548427 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c74b238-fb7a-440c-a7e0-349b21612bbb-cert\") pod \"frr-k8s-webhook-server-bc5694f79-pmxf9\" (UID: \"2c74b238-fb7a-440c-a7e0-349b21612bbb\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.650222 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw47n\" (UniqueName: \"kubernetes.io/projected/2c74b238-fb7a-440c-a7e0-349b21612bbb-kube-api-access-fw47n\") pod \"frr-k8s-webhook-server-bc5694f79-pmxf9\" (UID: \"2c74b238-fb7a-440c-a7e0-349b21612bbb\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.650279 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c74b238-fb7a-440c-a7e0-349b21612bbb-cert\") pod \"frr-k8s-webhook-server-bc5694f79-pmxf9\" (UID: \"2c74b238-fb7a-440c-a7e0-349b21612bbb\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.670197 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw47n\" (UniqueName: \"kubernetes.io/projected/2c74b238-fb7a-440c-a7e0-349b21612bbb-kube-api-access-fw47n\") pod \"frr-k8s-webhook-server-bc5694f79-pmxf9\" (UID: \"2c74b238-fb7a-440c-a7e0-349b21612bbb\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.680201 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c74b238-fb7a-440c-a7e0-349b21612bbb-cert\") pod 
\"frr-k8s-webhook-server-bc5694f79-pmxf9\" (UID: \"2c74b238-fb7a-440c-a7e0-349b21612bbb\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:42 crc kubenswrapper[5200]: I1126 13:46:42.791981 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:45 crc kubenswrapper[5200]: W1126 13:46:45.573212 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c74b238_fb7a_440c_a7e0_349b21612bbb.slice/crio-a5f979bf04c9b0492c3edb03593eabe9be57d2d6f5f19f13377740edbe562241 WatchSource:0}: Error finding container a5f979bf04c9b0492c3edb03593eabe9be57d2d6f5f19f13377740edbe562241: Status 404 returned error can't find the container with id a5f979bf04c9b0492c3edb03593eabe9be57d2d6f5f19f13377740edbe562241 Nov 26 13:46:45 crc kubenswrapper[5200]: I1126 13:46:45.666382 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" event={"ID":"2c74b238-fb7a-440c-a7e0-349b21612bbb","Type":"ContainerStarted","Data":"a5f979bf04c9b0492c3edb03593eabe9be57d2d6f5f19f13377740edbe562241"} Nov 26 13:46:46 crc kubenswrapper[5200]: I1126 13:46:46.677379 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" event={"ID":"081bb9fb-932b-4364-a4aa-a18b775a282d","Type":"ContainerStarted","Data":"20f0b15321cce524f0947a231e624c0034193f6cab0b2ed3be955058df9d9159"} Nov 26 13:46:46 crc kubenswrapper[5200]: I1126 13:46:46.679139 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"e8956eaca863428673d583bf765514b7ac64d49ca46674804601eb2d08c3ad01"} Nov 26 13:46:46 crc kubenswrapper[5200]: I1126 13:46:46.707390 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-655ccd8ccb-d4rvz" podStartSLOduration=1.571173485 podStartE2EDuration="10.707361512s" podCreationTimestamp="2025-11-26 13:46:36 +0000 UTC" firstStartedPulling="2025-11-26 13:46:37.16755265 +0000 UTC m=+965.846754468" lastFinishedPulling="2025-11-26 13:46:46.303740677 +0000 UTC m=+974.982942495" observedRunningTime="2025-11-26 13:46:46.702107181 +0000 UTC m=+975.381309069" watchObservedRunningTime="2025-11-26 13:46:46.707361512 +0000 UTC m=+975.386563330" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.637707 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg"] Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.649784 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.652729 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg"] Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.652983 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.653189 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.653603 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-t96j6\"" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.739423 5200 generic.go:358] "Generic (PLEG): container finished" podID="8a76015f-70f1-4d45-ac2c-1d7158278ccc" containerID="f92d1c81d697c2f51afeeafc72c7a5bf4363fdc2b31957eb8c6b94dde423060e" exitCode=0 Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.739589 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerDied","Data":"f92d1c81d697c2f51afeeafc72c7a5bf4363fdc2b31957eb8c6b94dde423060e"} Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.743611 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" event={"ID":"2c74b238-fb7a-440c-a7e0-349b21612bbb","Type":"ContainerStarted","Data":"3e31e2f6bd2820f0dc2ab3b6f8eaa27201aaf4fab8e092ed197fe1ebd75241dd"} Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.744064 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.781187 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbcld\" (UniqueName: \"kubernetes.io/projected/7dff5877-dace-4d99-84dd-7f19da274fce-kube-api-access-kbcld\") pod \"cert-manager-cainjector-7dbf76d5c8-sldvg\" (UID: \"7dff5877-dace-4d99-84dd-7f19da274fce\") " pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.781237 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7dff5877-dace-4d99-84dd-7f19da274fce-bound-sa-token\") pod \"cert-manager-cainjector-7dbf76d5c8-sldvg\" (UID: \"7dff5877-dace-4d99-84dd-7f19da274fce\") " pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.883200 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbcld\" (UniqueName: \"kubernetes.io/projected/7dff5877-dace-4d99-84dd-7f19da274fce-kube-api-access-kbcld\") pod \"cert-manager-cainjector-7dbf76d5c8-sldvg\" (UID: \"7dff5877-dace-4d99-84dd-7f19da274fce\") " pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.883292 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7dff5877-dace-4d99-84dd-7f19da274fce-bound-sa-token\") pod 
\"cert-manager-cainjector-7dbf76d5c8-sldvg\" (UID: \"7dff5877-dace-4d99-84dd-7f19da274fce\") " pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.911308 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7dff5877-dace-4d99-84dd-7f19da274fce-bound-sa-token\") pod \"cert-manager-cainjector-7dbf76d5c8-sldvg\" (UID: \"7dff5877-dace-4d99-84dd-7f19da274fce\") " pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.914673 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbcld\" (UniqueName: \"kubernetes.io/projected/7dff5877-dace-4d99-84dd-7f19da274fce-kube-api-access-kbcld\") pod \"cert-manager-cainjector-7dbf76d5c8-sldvg\" (UID: \"7dff5877-dace-4d99-84dd-7f19da274fce\") " pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:53 crc kubenswrapper[5200]: I1126 13:46:53.987651 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" Nov 26 13:46:54 crc kubenswrapper[5200]: I1126 13:46:54.292190 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" podStartSLOduration=5.079816448 podStartE2EDuration="12.292160642s" podCreationTimestamp="2025-11-26 13:46:42 +0000 UTC" firstStartedPulling="2025-11-26 13:46:45.581961529 +0000 UTC m=+974.261163387" lastFinishedPulling="2025-11-26 13:46:52.794305763 +0000 UTC m=+981.473507581" observedRunningTime="2025-11-26 13:46:53.810123627 +0000 UTC m=+982.489325475" watchObservedRunningTime="2025-11-26 13:46:54.292160642 +0000 UTC m=+982.971362480" Nov 26 13:46:54 crc kubenswrapper[5200]: I1126 13:46:54.302094 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg"] Nov 26 13:46:54 crc kubenswrapper[5200]: W1126 13:46:54.313005 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dff5877_dace_4d99_84dd_7f19da274fce.slice/crio-d55e333a72d25a1ec921426268cfb0b0efc24a36c0bc8eb8d5fb4144611f6a20 WatchSource:0}: Error finding container d55e333a72d25a1ec921426268cfb0b0efc24a36c0bc8eb8d5fb4144611f6a20: Status 404 returned error can't find the container with id d55e333a72d25a1ec921426268cfb0b0efc24a36c0bc8eb8d5fb4144611f6a20 Nov 26 13:46:54 crc kubenswrapper[5200]: I1126 13:46:54.756388 5200 generic.go:358] "Generic (PLEG): container finished" podID="8a76015f-70f1-4d45-ac2c-1d7158278ccc" containerID="a0832c807d79bc6deafe19636f0514c6bb61ac7972619d81a46801825f7ac556" exitCode=0 Nov 26 13:46:54 crc kubenswrapper[5200]: I1126 13:46:54.756557 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerDied","Data":"a0832c807d79bc6deafe19636f0514c6bb61ac7972619d81a46801825f7ac556"} Nov 26 13:46:54 crc kubenswrapper[5200]: I1126 13:46:54.760378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" event={"ID":"7dff5877-dace-4d99-84dd-7f19da274fce","Type":"ContainerStarted","Data":"d55e333a72d25a1ec921426268cfb0b0efc24a36c0bc8eb8d5fb4144611f6a20"} Nov 26 13:46:55 crc kubenswrapper[5200]: I1126 13:46:55.775421 5200 generic.go:358] "Generic (PLEG): container 
finished" podID="8a76015f-70f1-4d45-ac2c-1d7158278ccc" containerID="5b6471f811092e49417e48643210a29696bcff9f33eea635eb700bb125546ee5" exitCode=0 Nov 26 13:46:55 crc kubenswrapper[5200]: I1126 13:46:55.775475 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerDied","Data":"5b6471f811092e49417e48643210a29696bcff9f33eea635eb700bb125546ee5"} Nov 26 13:46:56 crc kubenswrapper[5200]: I1126 13:46:56.797791 5200 generic.go:358] "Generic (PLEG): container finished" podID="8a76015f-70f1-4d45-ac2c-1d7158278ccc" containerID="8452bafa15eac96e624843cc92e66e9a84f3a00406de0d0826553a562afdbf4f" exitCode=0 Nov 26 13:46:56 crc kubenswrapper[5200]: I1126 13:46:56.797854 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerDied","Data":"8452bafa15eac96e624843cc92e66e9a84f3a00406de0d0826553a562afdbf4f"} Nov 26 13:46:57 crc kubenswrapper[5200]: I1126 13:46:57.817515 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"d647dc5407b1f14835e88d92dc17245f1d08ccf93a8f4f25c652595be53750a5"} Nov 26 13:46:57 crc kubenswrapper[5200]: I1126 13:46:57.817949 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"372102aee974bbdd104373d2807e16ea7dbf709b0147ee82e5a28e55372334fc"} Nov 26 13:46:57 crc kubenswrapper[5200]: I1126 13:46:57.817965 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"d1a8d147feb8188473a3efd4857ca01d94c3e1adcdebd5130a203ee6b8c6c4b8"} Nov 26 13:47:01 crc kubenswrapper[5200]: I1126 13:47:01.445491 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr"] Nov 26 13:47:01 crc kubenswrapper[5200]: I1126 13:47:01.870080 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr"] Nov 26 13:47:01 crc kubenswrapper[5200]: I1126 13:47:01.870312 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:01 crc kubenswrapper[5200]: I1126 13:47:01.873789 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-lrlm5\"" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.047431 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g28ng\" (UniqueName: \"kubernetes.io/projected/ecf047b8-f174-41df-9c97-11cd9496a5ee-kube-api-access-g28ng\") pod \"cert-manager-webhook-7894b5b9b4-4z9qr\" (UID: \"ecf047b8-f174-41df-9c97-11cd9496a5ee\") " pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.047617 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ecf047b8-f174-41df-9c97-11cd9496a5ee-bound-sa-token\") pod \"cert-manager-webhook-7894b5b9b4-4z9qr\" (UID: \"ecf047b8-f174-41df-9c97-11cd9496a5ee\") " pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.149731 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ecf047b8-f174-41df-9c97-11cd9496a5ee-bound-sa-token\") pod \"cert-manager-webhook-7894b5b9b4-4z9qr\" (UID: \"ecf047b8-f174-41df-9c97-11cd9496a5ee\") " pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.149876 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g28ng\" (UniqueName: \"kubernetes.io/projected/ecf047b8-f174-41df-9c97-11cd9496a5ee-kube-api-access-g28ng\") pod \"cert-manager-webhook-7894b5b9b4-4z9qr\" (UID: \"ecf047b8-f174-41df-9c97-11cd9496a5ee\") " pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.172292 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ecf047b8-f174-41df-9c97-11cd9496a5ee-bound-sa-token\") pod \"cert-manager-webhook-7894b5b9b4-4z9qr\" (UID: \"ecf047b8-f174-41df-9c97-11cd9496a5ee\") " pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.175499 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g28ng\" (UniqueName: \"kubernetes.io/projected/ecf047b8-f174-41df-9c97-11cd9496a5ee-kube-api-access-g28ng\") pod \"cert-manager-webhook-7894b5b9b4-4z9qr\" (UID: \"ecf047b8-f174-41df-9c97-11cd9496a5ee\") " pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.228440 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.541561 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr"] Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.876824 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"5aaa6080499e8bf0bf7c9dcf760a1517cc370be143d303f00aba90f66e7a6ce2"} Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.876868 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"3d5dad0fe5b15f6d4d39681745592d88beb909b34bb632fac56904021164d838"} Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.878730 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" event={"ID":"ecf047b8-f174-41df-9c97-11cd9496a5ee","Type":"ContainerStarted","Data":"a96be0e657598cce88966301c27cfbe1d48eba8be92cd9e474a2fc4b00407f66"} Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.879185 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" event={"ID":"ecf047b8-f174-41df-9c97-11cd9496a5ee","Type":"ContainerStarted","Data":"0c545f4469e61b526a0210588b2abaf9b1233fdb2e6039d8bff8fc369fc6f590"} Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.879214 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.881751 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" event={"ID":"7dff5877-dace-4d99-84dd-7f19da274fce","Type":"ContainerStarted","Data":"bd86aa04e145b498e0271c9c51c937cfb8bf22bdff225cdd09de7ee7d1a3b18b"} Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.899880 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" podStartSLOduration=1.899859788 podStartE2EDuration="1.899859788s" podCreationTimestamp="2025-11-26 13:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:47:02.898996509 +0000 UTC m=+991.578198347" watchObservedRunningTime="2025-11-26 13:47:02.899859788 +0000 UTC m=+991.579061606" Nov 26 13:47:02 crc kubenswrapper[5200]: I1126 13:47:02.919774 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7dbf76d5c8-sldvg" podStartSLOduration=2.018826694 podStartE2EDuration="9.919751088s" podCreationTimestamp="2025-11-26 13:46:53 +0000 UTC" firstStartedPulling="2025-11-26 13:46:54.315808118 +0000 UTC m=+982.995009936" lastFinishedPulling="2025-11-26 13:47:02.216732512 +0000 UTC m=+990.895934330" observedRunningTime="2025-11-26 13:47:02.916733058 +0000 UTC m=+991.595934886" watchObservedRunningTime="2025-11-26 13:47:02.919751088 +0000 UTC m=+991.598952906" Nov 26 13:47:03 crc kubenswrapper[5200]: I1126 13:47:03.154310 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:47:03 crc kubenswrapper[5200]: I1126 13:47:03.154407 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:47:03 crc kubenswrapper[5200]: I1126 13:47:03.901001 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"fa4faf063dca29c647b53423361d41ab884e22492d52a54bd9cfd8fad0785bfc"} Nov 26 13:47:03 crc kubenswrapper[5200]: I1126 13:47:03.901624 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-whspq" event={"ID":"8a76015f-70f1-4d45-ac2c-1d7158278ccc","Type":"ContainerStarted","Data":"3b3b170e3b48f51e2be211cf604348187d04962a6747ebf34606d9bc2d1755f6"} Nov 26 13:47:03 crc kubenswrapper[5200]: I1126 13:47:03.956646 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-frr-k8s/frr-k8s-whspq" podStartSLOduration=15.912963697 podStartE2EDuration="22.956620008s" podCreationTimestamp="2025-11-26 13:46:41 +0000 UTC" firstStartedPulling="2025-11-26 13:46:45.74017615 +0000 UTC m=+974.419377978" lastFinishedPulling="2025-11-26 13:46:52.783832471 +0000 UTC m=+981.463034289" observedRunningTime="2025-11-26 13:47:03.953478476 +0000 UTC m=+992.632680324" watchObservedRunningTime="2025-11-26 13:47:03.956620008 +0000 UTC m=+992.635821836" Nov 26 13:47:04 crc kubenswrapper[5200]: I1126 13:47:04.767138 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-pmxf9" Nov 26 13:47:04 crc kubenswrapper[5200]: I1126 13:47:04.915999 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:47:07 crc kubenswrapper[5200]: I1126 13:47:07.016980 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:47:07 crc kubenswrapper[5200]: I1126 13:47:07.022173 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:47:07 crc kubenswrapper[5200]: I1126 13:47:07.060143 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-frr-k8s/frr-k8s-whspq" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.225320 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858d87f86b-jznlj"] Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.453907 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858d87f86b-jznlj"] Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.454167 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.458030 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-4msjv\"" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.499856 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89f8a963-9f48-4633-aa2a-a3126cd20edd-bound-sa-token\") pod \"cert-manager-858d87f86b-jznlj\" (UID: \"89f8a963-9f48-4633-aa2a-a3126cd20edd\") " pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.500259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qps96\" (UniqueName: \"kubernetes.io/projected/89f8a963-9f48-4633-aa2a-a3126cd20edd-kube-api-access-qps96\") pod \"cert-manager-858d87f86b-jznlj\" (UID: \"89f8a963-9f48-4633-aa2a-a3126cd20edd\") " pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.602054 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89f8a963-9f48-4633-aa2a-a3126cd20edd-bound-sa-token\") pod \"cert-manager-858d87f86b-jznlj\" (UID: \"89f8a963-9f48-4633-aa2a-a3126cd20edd\") " pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.602204 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qps96\" (UniqueName: \"kubernetes.io/projected/89f8a963-9f48-4633-aa2a-a3126cd20edd-kube-api-access-qps96\") pod \"cert-manager-858d87f86b-jznlj\" (UID: \"89f8a963-9f48-4633-aa2a-a3126cd20edd\") " pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.629672 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89f8a963-9f48-4633-aa2a-a3126cd20edd-bound-sa-token\") pod \"cert-manager-858d87f86b-jznlj\" (UID: \"89f8a963-9f48-4633-aa2a-a3126cd20edd\") " pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.629783 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qps96\" (UniqueName: \"kubernetes.io/projected/89f8a963-9f48-4633-aa2a-a3126cd20edd-kube-api-access-qps96\") pod \"cert-manager-858d87f86b-jznlj\" (UID: \"89f8a963-9f48-4633-aa2a-a3126cd20edd\") " pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.791019 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858d87f86b-jznlj" Nov 26 13:47:09 crc kubenswrapper[5200]: I1126 13:47:09.923249 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-7894b5b9b4-4z9qr" Nov 26 13:47:10 crc kubenswrapper[5200]: I1126 13:47:10.106363 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858d87f86b-jznlj"] Nov 26 13:47:10 crc kubenswrapper[5200]: W1126 13:47:10.114864 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f8a963_9f48_4633_aa2a_a3126cd20edd.slice/crio-c1f74f8859e1e716ef16694b5bdae89a08fd55bf83a4ebe46eee082a945c7c27 WatchSource:0}: Error finding container c1f74f8859e1e716ef16694b5bdae89a08fd55bf83a4ebe46eee082a945c7c27: Status 404 returned error can't find the container with id c1f74f8859e1e716ef16694b5bdae89a08fd55bf83a4ebe46eee082a945c7c27 Nov 26 13:47:10 crc kubenswrapper[5200]: I1126 13:47:10.994400 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858d87f86b-jznlj" event={"ID":"89f8a963-9f48-4633-aa2a-a3126cd20edd","Type":"ContainerStarted","Data":"655de0f47a813040d5733673c9a0278975f015d098f7d7839ee91d883c40663b"} Nov 26 13:47:10 crc kubenswrapper[5200]: I1126 13:47:10.994469 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858d87f86b-jznlj" event={"ID":"89f8a963-9f48-4633-aa2a-a3126cd20edd","Type":"ContainerStarted","Data":"c1f74f8859e1e716ef16694b5bdae89a08fd55bf83a4ebe46eee082a945c7c27"} Nov 26 13:47:11 crc kubenswrapper[5200]: I1126 13:47:11.017087 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858d87f86b-jznlj" podStartSLOduration=2.017050606 podStartE2EDuration="2.017050606s" podCreationTimestamp="2025-11-26 13:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:47:11.01378246 +0000 UTC m=+999.692984308" watchObservedRunningTime="2025-11-26 13:47:11.017050606 +0000 UTC m=+999.696252464" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.528564 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rmqm2"] Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.538760 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.547392 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack-operators\"/\"kube-root-ca.crt\"" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.547507 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack-operators\"/\"openshift-service-ca.crt\"" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.547656 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-index-dockercfg-62v62\"" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.549064 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rmqm2"] Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.697860 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5ggw\" (UniqueName: \"kubernetes.io/projected/a382e5ec-983b-442a-bc6a-1d50c7b375d1-kube-api-access-m5ggw\") pod \"openstack-operator-index-rmqm2\" (UID: \"a382e5ec-983b-442a-bc6a-1d50c7b375d1\") " pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.799835 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5ggw\" (UniqueName: \"kubernetes.io/projected/a382e5ec-983b-442a-bc6a-1d50c7b375d1-kube-api-access-m5ggw\") pod \"openstack-operator-index-rmqm2\" (UID: \"a382e5ec-983b-442a-bc6a-1d50c7b375d1\") " pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.840077 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5ggw\" (UniqueName: \"kubernetes.io/projected/a382e5ec-983b-442a-bc6a-1d50c7b375d1-kube-api-access-m5ggw\") pod \"openstack-operator-index-rmqm2\" (UID: \"a382e5ec-983b-442a-bc6a-1d50c7b375d1\") " pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:14 crc kubenswrapper[5200]: I1126 13:47:14.945316 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:15 crc kubenswrapper[5200]: I1126 13:47:15.513612 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rmqm2"] Nov 26 13:47:15 crc kubenswrapper[5200]: W1126 13:47:15.514994 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda382e5ec_983b_442a_bc6a_1d50c7b375d1.slice/crio-9f17efc9be805390edc6efc921c386e8eb86716a58ad587b544980dbdff9afd5 WatchSource:0}: Error finding container 9f17efc9be805390edc6efc921c386e8eb86716a58ad587b544980dbdff9afd5: Status 404 returned error can't find the container with id 9f17efc9be805390edc6efc921c386e8eb86716a58ad587b544980dbdff9afd5 Nov 26 13:47:16 crc kubenswrapper[5200]: I1126 13:47:16.048535 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rmqm2" event={"ID":"a382e5ec-983b-442a-bc6a-1d50c7b375d1","Type":"ContainerStarted","Data":"9f17efc9be805390edc6efc921c386e8eb86716a58ad587b544980dbdff9afd5"} Nov 26 13:47:18 crc kubenswrapper[5200]: I1126 13:47:18.072924 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rmqm2" event={"ID":"a382e5ec-983b-442a-bc6a-1d50c7b375d1","Type":"ContainerStarted","Data":"4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70"} Nov 26 13:47:18 crc kubenswrapper[5200]: I1126 13:47:18.104135 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rmqm2" podStartSLOduration=1.776824747 podStartE2EDuration="4.104104908s" podCreationTimestamp="2025-11-26 13:47:14 +0000 UTC" firstStartedPulling="2025-11-26 13:47:15.518080545 +0000 UTC m=+1004.197282393" lastFinishedPulling="2025-11-26 13:47:17.845360726 +0000 UTC m=+1006.524562554" observedRunningTime="2025-11-26 13:47:18.098056618 +0000 UTC m=+1006.777258476" watchObservedRunningTime="2025-11-26 13:47:18.104104908 +0000 UTC m=+1006.783306766" Nov 26 13:47:18 crc kubenswrapper[5200]: I1126 13:47:18.840509 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rmqm2"] Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.454820 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mmlxp"] Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.462007 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.464996 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mmlxp"] Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.586265 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4sk\" (UniqueName: \"kubernetes.io/projected/6e8aad04-af7e-40c5-be4c-5c30c93c067d-kube-api-access-ms4sk\") pod \"openstack-operator-index-mmlxp\" (UID: \"6e8aad04-af7e-40c5-be4c-5c30c93c067d\") " pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.688677 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms4sk\" (UniqueName: \"kubernetes.io/projected/6e8aad04-af7e-40c5-be4c-5c30c93c067d-kube-api-access-ms4sk\") pod \"openstack-operator-index-mmlxp\" (UID: \"6e8aad04-af7e-40c5-be4c-5c30c93c067d\") " pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.724891 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4sk\" (UniqueName: \"kubernetes.io/projected/6e8aad04-af7e-40c5-be4c-5c30c93c067d-kube-api-access-ms4sk\") pod \"openstack-operator-index-mmlxp\" (UID: \"6e8aad04-af7e-40c5-be4c-5c30c93c067d\") " pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:19 crc kubenswrapper[5200]: I1126 13:47:19.784394 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:20 crc kubenswrapper[5200]: I1126 13:47:20.099959 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rmqm2" podUID="a382e5ec-983b-442a-bc6a-1d50c7b375d1" containerName="registry-server" containerID="cri-o://4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70" gracePeriod=2 Nov 26 13:47:20 crc kubenswrapper[5200]: I1126 13:47:20.103666 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mmlxp"] Nov 26 13:47:20 crc kubenswrapper[5200]: W1126 13:47:20.109476 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8aad04_af7e_40c5_be4c_5c30c93c067d.slice/crio-da9e3f0ec9136626ff3b00c5a1df19cc2b6123597b9a1767130d2fd14505b5f1 WatchSource:0}: Error finding container da9e3f0ec9136626ff3b00c5a1df19cc2b6123597b9a1767130d2fd14505b5f1: Status 404 returned error can't find the container with id da9e3f0ec9136626ff3b00c5a1df19cc2b6123597b9a1767130d2fd14505b5f1 Nov 26 13:47:20 crc kubenswrapper[5200]: I1126 13:47:20.457857 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:20 crc kubenswrapper[5200]: I1126 13:47:20.605854 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5ggw\" (UniqueName: \"kubernetes.io/projected/a382e5ec-983b-442a-bc6a-1d50c7b375d1-kube-api-access-m5ggw\") pod \"a382e5ec-983b-442a-bc6a-1d50c7b375d1\" (UID: \"a382e5ec-983b-442a-bc6a-1d50c7b375d1\") " Nov 26 13:47:20 crc kubenswrapper[5200]: I1126 13:47:20.613982 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a382e5ec-983b-442a-bc6a-1d50c7b375d1-kube-api-access-m5ggw" (OuterVolumeSpecName: "kube-api-access-m5ggw") pod "a382e5ec-983b-442a-bc6a-1d50c7b375d1" (UID: "a382e5ec-983b-442a-bc6a-1d50c7b375d1"). InnerVolumeSpecName "kube-api-access-m5ggw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:47:20 crc kubenswrapper[5200]: I1126 13:47:20.707583 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5ggw\" (UniqueName: \"kubernetes.io/projected/a382e5ec-983b-442a-bc6a-1d50c7b375d1-kube-api-access-m5ggw\") on node \"crc\" DevicePath \"\"" Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.111100 5200 generic.go:358] "Generic (PLEG): container finished" podID="a382e5ec-983b-442a-bc6a-1d50c7b375d1" containerID="4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70" exitCode=0 Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.111225 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rmqm2" Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.111211 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rmqm2" event={"ID":"a382e5ec-983b-442a-bc6a-1d50c7b375d1","Type":"ContainerDied","Data":"4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70"} Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.111504 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rmqm2" event={"ID":"a382e5ec-983b-442a-bc6a-1d50c7b375d1","Type":"ContainerDied","Data":"9f17efc9be805390edc6efc921c386e8eb86716a58ad587b544980dbdff9afd5"} Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.111549 5200 scope.go:117] "RemoveContainer" containerID="4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70" Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.114400 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mmlxp" event={"ID":"6e8aad04-af7e-40c5-be4c-5c30c93c067d","Type":"ContainerStarted","Data":"b9a79ccfc45b84e4c7caaa920d3187cdd1f0eccad22057a4a7ab8f3d150684a4"} Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.114436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mmlxp" event={"ID":"6e8aad04-af7e-40c5-be4c-5c30c93c067d","Type":"ContainerStarted","Data":"da9e3f0ec9136626ff3b00c5a1df19cc2b6123597b9a1767130d2fd14505b5f1"} Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.141097 5200 scope.go:117] "RemoveContainer" containerID="4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70" Nov 26 13:47:21 crc kubenswrapper[5200]: E1126 13:47:21.141924 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70\": container with ID starting with 4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70 not found: ID does not exist" containerID="4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70" Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.142012 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70"} err="failed to get container status \"4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70\": rpc error: code = NotFound desc = could not find container \"4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70\": container with ID starting with 4509700ef1dc0b7c96f0218f5fde02d4f55c39335d982a98074e5dd81b542b70 not found: ID does not exist" Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.164958 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mmlxp" podStartSLOduration=2.111072986 podStartE2EDuration="2.164927019s" podCreationTimestamp="2025-11-26 13:47:19 +0000 UTC" firstStartedPulling="2025-11-26 13:47:20.11512411 +0000 UTC m=+1008.794325968" lastFinishedPulling="2025-11-26 13:47:20.168978183 +0000 UTC m=+1008.848180001" observedRunningTime="2025-11-26 13:47:21.148031619 +0000 UTC m=+1009.827233497" watchObservedRunningTime="2025-11-26 13:47:21.164927019 +0000 UTC m=+1009.844128917" Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.176605 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rmqm2"] Nov 26 13:47:21 crc kubenswrapper[5200]: I1126 13:47:21.184941 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rmqm2"] Nov 26 13:47:22 crc kubenswrapper[5200]: I1126 13:47:22.985912 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a382e5ec-983b-442a-bc6a-1d50c7b375d1" path="/var/lib/kubelet/pods/a382e5ec-983b-442a-bc6a-1d50c7b375d1/volumes" Nov 26 13:47:29 crc kubenswrapper[5200]: I1126 13:47:29.785536 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:29 crc kubenswrapper[5200]: I1126 13:47:29.786954 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:29 crc kubenswrapper[5200]: I1126 13:47:29.843955 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:30 crc kubenswrapper[5200]: I1126 13:47:30.259920 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mmlxp" Nov 26 13:47:30 crc kubenswrapper[5200]: I1126 13:47:30.982525 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc"] Nov 26 13:47:30 crc kubenswrapper[5200]: I1126 13:47:30.983476 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a382e5ec-983b-442a-bc6a-1d50c7b375d1" containerName="registry-server" Nov 26 13:47:30 crc kubenswrapper[5200]: I1126 13:47:30.983498 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a382e5ec-983b-442a-bc6a-1d50c7b375d1" containerName="registry-server" Nov 26 13:47:30 crc kubenswrapper[5200]: I1126 
13:47:30.983671 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a382e5ec-983b-442a-bc6a-1d50c7b375d1" containerName="registry-server" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.000901 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc"] Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.001199 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.004904 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"default-dockercfg-r6w42\"" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.098260 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqqvb\" (UniqueName: \"kubernetes.io/projected/f80ea0f0-fe9e-4342-bd0d-424dee43b603-kube-api-access-xqqvb\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.098318 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-util\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.098581 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-bundle\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.200881 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-bundle\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.201008 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqqvb\" (UniqueName: \"kubernetes.io/projected/f80ea0f0-fe9e-4342-bd0d-424dee43b603-kube-api-access-xqqvb\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.201044 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-util\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " 
pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.201570 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-bundle\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.201739 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-util\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.242306 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqqvb\" (UniqueName: \"kubernetes.io/projected/f80ea0f0-fe9e-4342-bd0d-424dee43b603-kube-api-access-xqqvb\") pod \"640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.318791 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:31 crc kubenswrapper[5200]: I1126 13:47:31.851706 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc"] Nov 26 13:47:31 crc kubenswrapper[5200]: W1126 13:47:31.867336 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80ea0f0_fe9e_4342_bd0d_424dee43b603.slice/crio-658395e089b3680441b7f279428e07b15b8ec032b77328410321316af5e1fa3c WatchSource:0}: Error finding container 658395e089b3680441b7f279428e07b15b8ec032b77328410321316af5e1fa3c: Status 404 returned error can't find the container with id 658395e089b3680441b7f279428e07b15b8ec032b77328410321316af5e1fa3c Nov 26 13:47:32 crc kubenswrapper[5200]: I1126 13:47:32.225169 5200 generic.go:358] "Generic (PLEG): container finished" podID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerID="4dfbe06f63a173393909ccc264dc43d01bae2c2d3eaf5dd85d5efec202d43d34" exitCode=0 Nov 26 13:47:32 crc kubenswrapper[5200]: I1126 13:47:32.225251 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" event={"ID":"f80ea0f0-fe9e-4342-bd0d-424dee43b603","Type":"ContainerDied","Data":"4dfbe06f63a173393909ccc264dc43d01bae2c2d3eaf5dd85d5efec202d43d34"} Nov 26 13:47:32 crc kubenswrapper[5200]: I1126 13:47:32.225317 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" event={"ID":"f80ea0f0-fe9e-4342-bd0d-424dee43b603","Type":"ContainerStarted","Data":"658395e089b3680441b7f279428e07b15b8ec032b77328410321316af5e1fa3c"} Nov 26 13:47:33 crc kubenswrapper[5200]: I1126 13:47:33.155172 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:47:33 crc kubenswrapper[5200]: I1126 13:47:33.155333 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:47:33 crc kubenswrapper[5200]: I1126 13:47:33.155414 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:47:33 crc kubenswrapper[5200]: I1126 13:47:33.156653 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"499ed0801b3e77071f6b5d1fa3f9f5b7ce1e53c1e78e3fecf97c942228f0d688"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:47:33 crc kubenswrapper[5200]: I1126 13:47:33.156793 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://499ed0801b3e77071f6b5d1fa3f9f5b7ce1e53c1e78e3fecf97c942228f0d688" gracePeriod=600 Nov 26 13:47:34 crc kubenswrapper[5200]: I1126 13:47:34.248326 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="499ed0801b3e77071f6b5d1fa3f9f5b7ce1e53c1e78e3fecf97c942228f0d688" exitCode=0 Nov 26 13:47:34 crc kubenswrapper[5200]: I1126 13:47:34.248408 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"499ed0801b3e77071f6b5d1fa3f9f5b7ce1e53c1e78e3fecf97c942228f0d688"} Nov 26 13:47:34 crc kubenswrapper[5200]: I1126 13:47:34.249028 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"9374219ae715a3e96430de37c40ed96a5c875650a0a5602a0ef3886cef886129"} Nov 26 13:47:34 crc kubenswrapper[5200]: I1126 13:47:34.249071 5200 scope.go:117] "RemoveContainer" containerID="4f8422efb2c4a1392904b1f896202fb75031d92c9e97e6bd79cfbb8fc5f04b08" Nov 26 13:47:35 crc kubenswrapper[5200]: I1126 13:47:35.261730 5200 generic.go:358] "Generic (PLEG): container finished" podID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerID="fdc2b99960d6b30cb8fc2f9926637ed10d5674f4fdd185c991d25a0c208faf27" exitCode=0 Nov 26 13:47:35 crc kubenswrapper[5200]: I1126 13:47:35.262058 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" event={"ID":"f80ea0f0-fe9e-4342-bd0d-424dee43b603","Type":"ContainerDied","Data":"fdc2b99960d6b30cb8fc2f9926637ed10d5674f4fdd185c991d25a0c208faf27"} Nov 26 13:47:36 crc kubenswrapper[5200]: I1126 13:47:36.286130 5200 generic.go:358] "Generic (PLEG): container finished" podID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" 
containerID="4f8926e87c1b3619c482a43413040f6f0a30cdd4bf8689b9e432f3c415c55bee" exitCode=0 Nov 26 13:47:36 crc kubenswrapper[5200]: I1126 13:47:36.286423 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" event={"ID":"f80ea0f0-fe9e-4342-bd0d-424dee43b603","Type":"ContainerDied","Data":"4f8926e87c1b3619c482a43413040f6f0a30cdd4bf8689b9e432f3c415c55bee"} Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.702043 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.818752 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqqvb\" (UniqueName: \"kubernetes.io/projected/f80ea0f0-fe9e-4342-bd0d-424dee43b603-kube-api-access-xqqvb\") pod \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.818838 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-bundle\") pod \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.818887 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-util\") pod \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\" (UID: \"f80ea0f0-fe9e-4342-bd0d-424dee43b603\") " Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.819675 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-bundle" (OuterVolumeSpecName: "bundle") pod "f80ea0f0-fe9e-4342-bd0d-424dee43b603" (UID: "f80ea0f0-fe9e-4342-bd0d-424dee43b603"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.842297 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-util" (OuterVolumeSpecName: "util") pod "f80ea0f0-fe9e-4342-bd0d-424dee43b603" (UID: "f80ea0f0-fe9e-4342-bd0d-424dee43b603"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.843364 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80ea0f0-fe9e-4342-bd0d-424dee43b603-kube-api-access-xqqvb" (OuterVolumeSpecName: "kube-api-access-xqqvb") pod "f80ea0f0-fe9e-4342-bd0d-424dee43b603" (UID: "f80ea0f0-fe9e-4342-bd0d-424dee43b603"). InnerVolumeSpecName "kube-api-access-xqqvb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.920363 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqqvb\" (UniqueName: \"kubernetes.io/projected/f80ea0f0-fe9e-4342-bd0d-424dee43b603-kube-api-access-xqqvb\") on node \"crc\" DevicePath \"\"" Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.920400 5200 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:47:37 crc kubenswrapper[5200]: I1126 13:47:37.920409 5200 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f80ea0f0-fe9e-4342-bd0d-424dee43b603-util\") on node \"crc\" DevicePath \"\"" Nov 26 13:47:38 crc kubenswrapper[5200]: I1126 13:47:38.315251 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" event={"ID":"f80ea0f0-fe9e-4342-bd0d-424dee43b603","Type":"ContainerDied","Data":"658395e089b3680441b7f279428e07b15b8ec032b77328410321316af5e1fa3c"} Nov 26 13:47:38 crc kubenswrapper[5200]: I1126 13:47:38.315337 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658395e089b3680441b7f279428e07b15b8ec032b77328410321316af5e1fa3c" Nov 26 13:47:38 crc kubenswrapper[5200]: I1126 13:47:38.315351 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc" Nov 26 13:47:42 crc kubenswrapper[5200]: E1126 13:47:42.077057 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=772638 actualBytes=10240 Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.438506 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44"] Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440024 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="pull" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440045 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="pull" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440068 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="util" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440076 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="util" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440105 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="extract" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440113 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="extract" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.440278 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f80ea0f0-fe9e-4342-bd0d-424dee43b603" containerName="extract" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.446426 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.450333 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-controller-operator-dockercfg-hgbdp\"" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.481147 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44"] Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.630538 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbk9\" (UniqueName: \"kubernetes.io/projected/0b996859-4ff2-4e3f-850c-fa191ea64051-kube-api-access-gtbk9\") pod \"openstack-operator-controller-operator-798fcdcd4f-4ht44\" (UID: \"0b996859-4ff2-4e3f-850c-fa191ea64051\") " pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.732130 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbk9\" (UniqueName: \"kubernetes.io/projected/0b996859-4ff2-4e3f-850c-fa191ea64051-kube-api-access-gtbk9\") pod \"openstack-operator-controller-operator-798fcdcd4f-4ht44\" (UID: \"0b996859-4ff2-4e3f-850c-fa191ea64051\") " pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.758765 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbk9\" (UniqueName: \"kubernetes.io/projected/0b996859-4ff2-4e3f-850c-fa191ea64051-kube-api-access-gtbk9\") pod \"openstack-operator-controller-operator-798fcdcd4f-4ht44\" (UID: \"0b996859-4ff2-4e3f-850c-fa191ea64051\") " pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:47:43 crc kubenswrapper[5200]: I1126 13:47:43.769997 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:47:44 crc kubenswrapper[5200]: I1126 13:47:44.277916 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44"] Nov 26 13:47:44 crc kubenswrapper[5200]: I1126 13:47:44.373205 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" event={"ID":"0b996859-4ff2-4e3f-850c-fa191ea64051","Type":"ContainerStarted","Data":"3148640716abd9bde7bec089b6cd42cffb6fdea7bbc512ef67cf052799efd081"} Nov 26 13:47:50 crc kubenswrapper[5200]: I1126 13:47:50.421188 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" event={"ID":"0b996859-4ff2-4e3f-850c-fa191ea64051","Type":"ContainerStarted","Data":"ca616ec3908a30fe3f1bfd3e5d19744cfcc1ab8ebd907407eb940d808d77e2c7"} Nov 26 13:47:50 crc kubenswrapper[5200]: I1126 13:47:50.421943 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:47:50 crc kubenswrapper[5200]: I1126 13:47:50.457020 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" podStartSLOduration=2.04920533 podStartE2EDuration="7.45699739s" podCreationTimestamp="2025-11-26 13:47:43 +0000 UTC" firstStartedPulling="2025-11-26 13:47:44.282727787 +0000 UTC m=+1032.961929605" lastFinishedPulling="2025-11-26 13:47:49.690519827 +0000 UTC m=+1038.369721665" observedRunningTime="2025-11-26 13:47:50.451138677 +0000 UTC m=+1039.130340505" watchObservedRunningTime="2025-11-26 13:47:50.45699739 +0000 UTC m=+1039.136199218" Nov 26 13:48:01 crc kubenswrapper[5200]: I1126 13:48:01.433411 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-798fcdcd4f-4ht44" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.740427 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.768583 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.778222 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.787133 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"barbican-operator-controller-manager-dockercfg-dq7w2\"" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.821029 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.832797 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.842133 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"cinder-operator-controller-manager-dockercfg-l4t2s\"" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.850891 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.862855 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxjpx\" (UniqueName: \"kubernetes.io/projected/db4bbd86-c902-4e34-bb73-14f2049c52bc-kube-api-access-fxjpx\") pod \"cinder-operator-controller-manager-79976d8764-8xs2k\" (UID: \"db4bbd86-c902-4e34-bb73-14f2049c52bc\") " pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.862958 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmqt\" (UniqueName: \"kubernetes.io/projected/a7d2842d-c7b4-4d44-9734-b8378d3346a2-kube-api-access-bjmqt\") pod \"barbican-operator-controller-manager-57ffb97849-48vc9\" (UID: \"a7d2842d-c7b4-4d44-9734-b8378d3346a2\") " pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.864544 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"designate-operator-controller-manager-dockercfg-dfbrw\"" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.871080 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.919777 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.935833 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.944765 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.962331 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m"] Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.974398 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxjpx\" (UniqueName: \"kubernetes.io/projected/db4bbd86-c902-4e34-bb73-14f2049c52bc-kube-api-access-fxjpx\") pod \"cinder-operator-controller-manager-79976d8764-8xs2k\" (UID: \"db4bbd86-c902-4e34-bb73-14f2049c52bc\") " pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.974506 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq9mt\" 
(UniqueName: \"kubernetes.io/projected/784e0ea8-032f-4247-87c8-890fb70038f3-kube-api-access-rq9mt\") pod \"designate-operator-controller-manager-664984dfdb-9llqp\" (UID: \"784e0ea8-032f-4247-87c8-890fb70038f3\") " pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.974565 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjmqt\" (UniqueName: \"kubernetes.io/projected/a7d2842d-c7b4-4d44-9734-b8378d3346a2-kube-api-access-bjmqt\") pod \"barbican-operator-controller-manager-57ffb97849-48vc9\" (UID: \"a7d2842d-c7b4-4d44-9734-b8378d3346a2\") " pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.975211 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.975994 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.984318 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"heat-operator-controller-manager-dockercfg-7sc7q\"" Nov 26 13:48:20 crc kubenswrapper[5200]: I1126 13:48:20.990595 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"glance-operator-controller-manager-dockercfg-s6sxx\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.000267 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.005783 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.032273 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.049069 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmqt\" (UniqueName: \"kubernetes.io/projected/a7d2842d-c7b4-4d44-9734-b8378d3346a2-kube-api-access-bjmqt\") pod \"barbican-operator-controller-manager-57ffb97849-48vc9\" (UID: \"a7d2842d-c7b4-4d44-9734-b8378d3346a2\") " pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.049091 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.057770 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.062263 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.066799 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"horizon-operator-controller-manager-dockercfg-ghkzk\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.067529 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxjpx\" (UniqueName: \"kubernetes.io/projected/db4bbd86-c902-4e34-bb73-14f2049c52bc-kube-api-access-fxjpx\") pod \"cinder-operator-controller-manager-79976d8764-8xs2k\" (UID: \"db4bbd86-c902-4e34-bb73-14f2049c52bc\") " pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.067947 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.078310 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"infra-operator-controller-manager-dockercfg-nhrpq\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.078622 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"infra-operator-webhook-server-cert\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.079475 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrq5m\" (UniqueName: \"kubernetes.io/projected/7810ba68-8ec1-4268-99ab-196e7d964387-kube-api-access-qrq5m\") pod \"heat-operator-controller-manager-7d5b8c9b49-lsh9m\" (UID: \"7810ba68-8ec1-4268-99ab-196e7d964387\") " pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.079536 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkq6\" (UniqueName: \"kubernetes.io/projected/67d8cd88-e803-4d94-a43e-4e52da2c2baf-kube-api-access-bqkq6\") pod \"glance-operator-controller-manager-77c8d6d47b-mv8dt\" (UID: \"67d8cd88-e803-4d94-a43e-4e52da2c2baf\") " pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.079644 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq9mt\" (UniqueName: \"kubernetes.io/projected/784e0ea8-032f-4247-87c8-890fb70038f3-kube-api-access-rq9mt\") pod \"designate-operator-controller-manager-664984dfdb-9llqp\" (UID: \"784e0ea8-032f-4247-87c8-890fb70038f3\") " pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.088860 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.122574 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.122808 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.132642 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.132862 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.156325 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.158008 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"ironic-operator-controller-manager-dockercfg-xhn46\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.165863 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.171267 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.176542 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"keystone-operator-controller-manager-dockercfg-jvphl\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.184072 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztkr\" (UniqueName: \"kubernetes.io/projected/58c1e92c-4f66-475a-aa1a-119d8a9aaa88-kube-api-access-hztkr\") pod \"horizon-operator-controller-manager-6dbbbff649-m9lc9\" (UID: \"58c1e92c-4f66-475a-aa1a-119d8a9aaa88\") " pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.184117 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj9fv\" (UniqueName: \"kubernetes.io/projected/e5a61fb8-33a8-4763-84ad-5973feee71a1-kube-api-access-dj9fv\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.184159 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdh8g\" (UniqueName: \"kubernetes.io/projected/01344b64-a822-408a-b687-698b296cc8db-kube-api-access-kdh8g\") pod \"ironic-operator-controller-manager-5bc6b587b-lkn4w\" (UID: \"01344b64-a822-408a-b687-698b296cc8db\") " pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.184211 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrq5m\" (UniqueName: \"kubernetes.io/projected/7810ba68-8ec1-4268-99ab-196e7d964387-kube-api-access-qrq5m\") pod \"heat-operator-controller-manager-7d5b8c9b49-lsh9m\" (UID: 
\"7810ba68-8ec1-4268-99ab-196e7d964387\") " pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.184244 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.184265 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkq6\" (UniqueName: \"kubernetes.io/projected/67d8cd88-e803-4d94-a43e-4e52da2c2baf-kube-api-access-bqkq6\") pod \"glance-operator-controller-manager-77c8d6d47b-mv8dt\" (UID: \"67d8cd88-e803-4d94-a43e-4e52da2c2baf\") " pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.187605 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq9mt\" (UniqueName: \"kubernetes.io/projected/784e0ea8-032f-4247-87c8-890fb70038f3-kube-api-access-rq9mt\") pod \"designate-operator-controller-manager-664984dfdb-9llqp\" (UID: \"784e0ea8-032f-4247-87c8-890fb70038f3\") " pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.205215 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-696667c874-6stq7"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.234470 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.252861 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.252920 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-696667c874-6stq7"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.253086 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.253613 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.261247 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"mariadb-operator-controller-manager-dockercfg-hl665\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.261660 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.261748 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkq6\" (UniqueName: \"kubernetes.io/projected/67d8cd88-e803-4d94-a43e-4e52da2c2baf-kube-api-access-bqkq6\") pod \"glance-operator-controller-manager-77c8d6d47b-mv8dt\" (UID: \"67d8cd88-e803-4d94-a43e-4e52da2c2baf\") " pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.280415 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"manila-operator-controller-manager-dockercfg-dxws2\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.280740 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrq5m\" (UniqueName: \"kubernetes.io/projected/7810ba68-8ec1-4268-99ab-196e7d964387-kube-api-access-qrq5m\") pod \"heat-operator-controller-manager-7d5b8c9b49-lsh9m\" (UID: \"7810ba68-8ec1-4268-99ab-196e7d964387\") " pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.281186 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.285205 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdh8g\" (UniqueName: \"kubernetes.io/projected/01344b64-a822-408a-b687-698b296cc8db-kube-api-access-kdh8g\") pod \"ironic-operator-controller-manager-5bc6b587b-lkn4w\" (UID: \"01344b64-a822-408a-b687-698b296cc8db\") " pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.285267 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.285297 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4xjp\" (UniqueName: \"kubernetes.io/projected/d27dbce3-5af6-40a8-960c-4f6965dabcad-kube-api-access-j4xjp\") pod \"manila-operator-controller-manager-696667c874-6stq7\" (UID: \"d27dbce3-5af6-40a8-960c-4f6965dabcad\") " pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.285344 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hztkr\" (UniqueName: \"kubernetes.io/projected/58c1e92c-4f66-475a-aa1a-119d8a9aaa88-kube-api-access-hztkr\") pod \"horizon-operator-controller-manager-6dbbbff649-m9lc9\" (UID: 
\"58c1e92c-4f66-475a-aa1a-119d8a9aaa88\") " pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.285367 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj9fv\" (UniqueName: \"kubernetes.io/projected/e5a61fb8-33a8-4763-84ad-5973feee71a1-kube-api-access-dj9fv\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.285386 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79g5p\" (UniqueName: \"kubernetes.io/projected/4f0ee79e-a3de-4744-86a9-ca2a6b81675c-kube-api-access-79g5p\") pod \"keystone-operator-controller-manager-7dd76969c9-ldrsq\" (UID: \"4f0ee79e-a3de-4744-86a9-ca2a6b81675c\") " pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:21 crc kubenswrapper[5200]: E1126 13:48:21.285836 5200 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:21 crc kubenswrapper[5200]: E1126 13:48:21.285903 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert podName:e5a61fb8-33a8-4763-84ad-5973feee71a1 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:21.785881007 +0000 UTC m=+1070.465082825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert") pod "infra-operator-controller-manager-9cc5b6bd4-kksm2" (UID: "e5a61fb8-33a8-4763-84ad-5973feee71a1") : secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.326387 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.326426 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.340590 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"neutron-operator-controller-manager-dockercfg-x97zr\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.345147 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj9fv\" (UniqueName: \"kubernetes.io/projected/e5a61fb8-33a8-4763-84ad-5973feee71a1-kube-api-access-dj9fv\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.345981 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdh8g\" (UniqueName: \"kubernetes.io/projected/01344b64-a822-408a-b687-698b296cc8db-kube-api-access-kdh8g\") pod \"ironic-operator-controller-manager-5bc6b587b-lkn4w\" (UID: \"01344b64-a822-408a-b687-698b296cc8db\") " pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.381033 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74f4685-bqfwt"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.394025 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79g5p\" (UniqueName: \"kubernetes.io/projected/4f0ee79e-a3de-4744-86a9-ca2a6b81675c-kube-api-access-79g5p\") pod \"keystone-operator-controller-manager-7dd76969c9-ldrsq\" (UID: \"4f0ee79e-a3de-4744-86a9-ca2a6b81675c\") " pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.394197 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/3db58e3f-7ff8-42b2-abee-c0b9f7c3d596-kube-api-access-5nfz7\") pod \"mariadb-operator-controller-manager-784656dc9c-9mbvq\" (UID: \"3db58e3f-7ff8-42b2-abee-c0b9f7c3d596\") " pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.394390 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4xjp\" (UniqueName: \"kubernetes.io/projected/d27dbce3-5af6-40a8-960c-4f6965dabcad-kube-api-access-j4xjp\") pod \"manila-operator-controller-manager-696667c874-6stq7\" (UID: \"d27dbce3-5af6-40a8-960c-4f6965dabcad\") " pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.394471 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnprn\" (UniqueName: \"kubernetes.io/projected/19d0e851-53ed-41ce-8730-3c8a1349afd6-kube-api-access-tnprn\") pod \"neutron-operator-controller-manager-6584fc9d7c-nfb9x\" (UID: \"19d0e851-53ed-41ce-8730-3c8a1349afd6\") " pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.411129 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.417888 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.431956 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"nova-operator-controller-manager-dockercfg-6vq9g\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.437133 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hztkr\" (UniqueName: \"kubernetes.io/projected/58c1e92c-4f66-475a-aa1a-119d8a9aaa88-kube-api-access-hztkr\") pod \"horizon-operator-controller-manager-6dbbbff649-m9lc9\" (UID: \"58c1e92c-4f66-475a-aa1a-119d8a9aaa88\") " pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.440626 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79g5p\" (UniqueName: \"kubernetes.io/projected/4f0ee79e-a3de-4744-86a9-ca2a6b81675c-kube-api-access-79g5p\") pod \"keystone-operator-controller-manager-7dd76969c9-ldrsq\" (UID: \"4f0ee79e-a3de-4744-86a9-ca2a6b81675c\") " pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.467375 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4xjp\" (UniqueName: \"kubernetes.io/projected/d27dbce3-5af6-40a8-960c-4f6965dabcad-kube-api-access-j4xjp\") pod \"manila-operator-controller-manager-696667c874-6stq7\" (UID: \"d27dbce3-5af6-40a8-960c-4f6965dabcad\") " pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.470896 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.485006 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.506370 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/3db58e3f-7ff8-42b2-abee-c0b9f7c3d596-kube-api-access-5nfz7\") pod \"mariadb-operator-controller-manager-784656dc9c-9mbvq\" (UID: \"3db58e3f-7ff8-42b2-abee-c0b9f7c3d596\") " pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.506523 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6l78\" (UniqueName: \"kubernetes.io/projected/e0557c39-59ea-4478-a6e8-ecaed3a311d5-kube-api-access-f6l78\") pod \"nova-operator-controller-manager-74f4685-bqfwt\" (UID: \"e0557c39-59ea-4478-a6e8-ecaed3a311d5\") " pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.506641 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnprn\" (UniqueName: \"kubernetes.io/projected/19d0e851-53ed-41ce-8730-3c8a1349afd6-kube-api-access-tnprn\") pod \"neutron-operator-controller-manager-6584fc9d7c-nfb9x\" (UID: \"19d0e851-53ed-41ce-8730-3c8a1349afd6\") " pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.544855 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/3db58e3f-7ff8-42b2-abee-c0b9f7c3d596-kube-api-access-5nfz7\") pod \"mariadb-operator-controller-manager-784656dc9c-9mbvq\" (UID: \"3db58e3f-7ff8-42b2-abee-c0b9f7c3d596\") " pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.550210 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnprn\" (UniqueName: \"kubernetes.io/projected/19d0e851-53ed-41ce-8730-3c8a1349afd6-kube-api-access-tnprn\") pod \"neutron-operator-controller-manager-6584fc9d7c-nfb9x\" (UID: \"19d0e851-53ed-41ce-8730-3c8a1349afd6\") " pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.558548 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74f4685-bqfwt"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.570873 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.582008 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.583908 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.584041 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.587055 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"octavia-operator-controller-manager-dockercfg-qpq5d\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.602489 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-698478755b-twvs8"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.604174 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.607835 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6l78\" (UniqueName: \"kubernetes.io/projected/e0557c39-59ea-4478-a6e8-ecaed3a311d5-kube-api-access-f6l78\") pod \"nova-operator-controller-manager-74f4685-bqfwt\" (UID: \"e0557c39-59ea-4478-a6e8-ecaed3a311d5\") " pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.624097 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.631512 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"ovn-operator-controller-manager-dockercfg-cqlh4\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.647376 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6l78\" (UniqueName: \"kubernetes.io/projected/e0557c39-59ea-4478-a6e8-ecaed3a311d5-kube-api-access-f6l78\") pod \"nova-operator-controller-manager-74f4685-bqfwt\" (UID: \"e0557c39-59ea-4478-a6e8-ecaed3a311d5\") " pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.662535 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.671901 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.675195 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-baremetal-operator-webhook-server-cert\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.675533 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.675701 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-baremetal-operator-controller-manager-dockercfg-br4fc\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.688790 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-698478755b-twvs8"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.688975 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.691310 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"placement-operator-controller-manager-dockercfg-dm48t\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.708827 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.712434 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tg9\" (UniqueName: \"kubernetes.io/projected/8870e7ff-5b48-4a97-a71b-659c3ad81431-kube-api-access-d5tg9\") pod \"ovn-operator-controller-manager-698478755b-twvs8\" (UID: \"8870e7ff-5b48-4a97-a71b-659c3ad81431\") " pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.712488 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjkz\" (UniqueName: \"kubernetes.io/projected/c151e7d7-c8ea-4c75-b933-6ec3d84201fb-kube-api-access-mxjkz\") pod \"octavia-operator-controller-manager-565f56787b-szbjv\" (UID: \"c151e7d7-c8ea-4c75-b933-6ec3d84201fb\") " pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.712559 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprgq\" (UniqueName: \"kubernetes.io/projected/8a630dbf-7443-4cdd-b250-73108e8abeba-kube-api-access-lprgq\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.712638 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.723865 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.726811 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.738827 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.749315 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.752956 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.757810 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"telemetry-operator-controller-manager-dockercfg-m92sj\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.758192 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.776539 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.789566 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.810010 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814089 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814149 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814213 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tg9\" (UniqueName: \"kubernetes.io/projected/8870e7ff-5b48-4a97-a71b-659c3ad81431-kube-api-access-d5tg9\") pod \"ovn-operator-controller-manager-698478755b-twvs8\" (UID: \"8870e7ff-5b48-4a97-a71b-659c3ad81431\") " pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814254 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjkz\" (UniqueName: \"kubernetes.io/projected/c151e7d7-c8ea-4c75-b933-6ec3d84201fb-kube-api-access-mxjkz\") pod \"octavia-operator-controller-manager-565f56787b-szbjv\" (UID: \"c151e7d7-c8ea-4c75-b933-6ec3d84201fb\") " pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrcq7\" (UniqueName: \"kubernetes.io/projected/a32a9a76-fc65-489b-9481-73c9f7ff2870-kube-api-access-mrcq7\") pod \"placement-operator-controller-manager-5c76888db5-cfxkz\" (UID: \"a32a9a76-fc65-489b-9481-73c9f7ff2870\") " pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814343 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qk56\" (UniqueName: \"kubernetes.io/projected/3f88a352-bb4b-49cd-9606-de25d1500c22-kube-api-access-4qk56\") pod \"telemetry-operator-controller-manager-5bdb4cd4b-d8x72\" (UID: \"3f88a352-bb4b-49cd-9606-de25d1500c22\") " pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.814387 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lprgq\" (UniqueName: \"kubernetes.io/projected/8a630dbf-7443-4cdd-b250-73108e8abeba-kube-api-access-lprgq\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:21 crc kubenswrapper[5200]: E1126 13:48:21.814872 5200 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:21 crc kubenswrapper[5200]: E1126 13:48:21.814928 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert podName:e5a61fb8-33a8-4763-84ad-5973feee71a1 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:22.814910393 +0000 UTC m=+1071.494112211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert") pod "infra-operator-controller-manager-9cc5b6bd4-kksm2" (UID: "e5a61fb8-33a8-4763-84ad-5973feee71a1") : secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:21 crc kubenswrapper[5200]: E1126 13:48:21.814978 5200 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:21 crc kubenswrapper[5200]: E1126 13:48:21.814999 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert podName:8a630dbf-7443-4cdd-b250-73108e8abeba nodeName:}" failed. No retries permitted until 2025-11-26 13:48:22.314991065 +0000 UTC m=+1070.994192883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert") pod "openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" (UID: "8a630dbf-7443-4cdd-b250-73108e8abeba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.821324 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55f4974db-8qmm7"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.841790 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.841874 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.844235 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprgq\" (UniqueName: \"kubernetes.io/projected/8a630dbf-7443-4cdd-b250-73108e8abeba-kube-api-access-lprgq\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.844299 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tg9\" (UniqueName: \"kubernetes.io/projected/8870e7ff-5b48-4a97-a71b-659c3ad81431-kube-api-access-d5tg9\") pod \"ovn-operator-controller-manager-698478755b-twvs8\" (UID: \"8870e7ff-5b48-4a97-a71b-659c3ad81431\") " pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.848835 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"swift-operator-controller-manager-dockercfg-zx57x\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.854001 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.869853 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"test-operator-controller-manager-dockercfg-xqrv6\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.871304 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjkz\" (UniqueName: \"kubernetes.io/projected/c151e7d7-c8ea-4c75-b933-6ec3d84201fb-kube-api-access-mxjkz\") pod \"octavia-operator-controller-manager-565f56787b-szbjv\" (UID: \"c151e7d7-c8ea-4c75-b933-6ec3d84201fb\") " pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.880574 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.891863 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55f4974db-8qmm7"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.902928 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-89776cb76-99p75"] Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.910595 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.917320 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qk56\" (UniqueName: \"kubernetes.io/projected/3f88a352-bb4b-49cd-9606-de25d1500c22-kube-api-access-4qk56\") pod \"telemetry-operator-controller-manager-5bdb4cd4b-d8x72\" (UID: \"3f88a352-bb4b-49cd-9606-de25d1500c22\") " pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.917397 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcm4b\" (UniqueName: \"kubernetes.io/projected/2276e8f6-42e2-462a-92ea-321b00fb4664-kube-api-access-pcm4b\") pod \"test-operator-controller-manager-55f4974db-8qmm7\" (UID: \"2276e8f6-42e2-462a-92ea-321b00fb4664\") " pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.917554 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrcq7\" (UniqueName: \"kubernetes.io/projected/a32a9a76-fc65-489b-9481-73c9f7ff2870-kube-api-access-mrcq7\") pod \"placement-operator-controller-manager-5c76888db5-cfxkz\" (UID: \"a32a9a76-fc65-489b-9481-73c9f7ff2870\") " pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.917595 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86v99\" (UniqueName: \"kubernetes.io/projected/7e8a26f8-ad1f-4de6-a3f7-063c7635d25d-kube-api-access-86v99\") pod \"swift-operator-controller-manager-7ddc654f5-4nzdn\" (UID: \"7e8a26f8-ad1f-4de6-a3f7-063c7635d25d\") " pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.918009 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"watcher-operator-controller-manager-dockercfg-zckzz\"" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.933107 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.945603 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qk56\" (UniqueName: \"kubernetes.io/projected/3f88a352-bb4b-49cd-9606-de25d1500c22-kube-api-access-4qk56\") pod \"telemetry-operator-controller-manager-5bdb4cd4b-d8x72\" (UID: \"3f88a352-bb4b-49cd-9606-de25d1500c22\") " pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.951959 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrcq7\" (UniqueName: \"kubernetes.io/projected/a32a9a76-fc65-489b-9481-73c9f7ff2870-kube-api-access-mrcq7\") pod \"placement-operator-controller-manager-5c76888db5-cfxkz\" (UID: \"a32a9a76-fc65-489b-9481-73c9f7ff2870\") " pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:48:21 crc kubenswrapper[5200]: I1126 13:48:21.959203 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-89776cb76-99p75"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.024281 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcm4b\" (UniqueName: \"kubernetes.io/projected/2276e8f6-42e2-462a-92ea-321b00fb4664-kube-api-access-pcm4b\") pod \"test-operator-controller-manager-55f4974db-8qmm7\" (UID: \"2276e8f6-42e2-462a-92ea-321b00fb4664\") " pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.024545 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm48r\" (UniqueName: \"kubernetes.io/projected/7f7c5b44-fd61-434d-b66f-6050b7dfa67c-kube-api-access-hm48r\") pod \"watcher-operator-controller-manager-89776cb76-99p75\" (UID: \"7f7c5b44-fd61-434d-b66f-6050b7dfa67c\") " pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.024814 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86v99\" (UniqueName: \"kubernetes.io/projected/7e8a26f8-ad1f-4de6-a3f7-063c7635d25d-kube-api-access-86v99\") pod \"swift-operator-controller-manager-7ddc654f5-4nzdn\" (UID: \"7e8a26f8-ad1f-4de6-a3f7-063c7635d25d\") " pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.034455 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.035097 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.046203 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.047290 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.051631 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.053024 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"webhook-server-cert\"" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.053253 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-controller-manager-dockercfg-rxjmv\"" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.053388 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"metrics-server-cert\"" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.063591 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcm4b\" (UniqueName: \"kubernetes.io/projected/2276e8f6-42e2-462a-92ea-321b00fb4664-kube-api-access-pcm4b\") pod \"test-operator-controller-manager-55f4974db-8qmm7\" (UID: \"2276e8f6-42e2-462a-92ea-321b00fb4664\") " pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.063776 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86v99\" (UniqueName: \"kubernetes.io/projected/7e8a26f8-ad1f-4de6-a3f7-063c7635d25d-kube-api-access-86v99\") pod \"swift-operator-controller-manager-7ddc654f5-4nzdn\" (UID: \"7e8a26f8-ad1f-4de6-a3f7-063c7635d25d\") " pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.077215 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.081184 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.087293 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.087469 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.090860 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"rabbitmq-cluster-operator-controller-manager-dockercfg-v8vrc\"" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.127259 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.127444 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.128659 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.128707 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchqv\" (UniqueName: \"kubernetes.io/projected/b4cb36a7-b850-428e-8c2a-ffd54329db04-kube-api-access-xchqv\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.128813 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgz5l\" (UniqueName: \"kubernetes.io/projected/1993ad74-5fa8-4fda-8f96-774585eba2a7-kube-api-access-tgz5l\") pod \"rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9\" (UID: \"1993ad74-5fa8-4fda-8f96-774585eba2a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.128834 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hm48r\" (UniqueName: \"kubernetes.io/projected/7f7c5b44-fd61-434d-b66f-6050b7dfa67c-kube-api-access-hm48r\") pod \"watcher-operator-controller-manager-89776cb76-99p75\" (UID: \"7f7c5b44-fd61-434d-b66f-6050b7dfa67c\") " pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.128896 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.166457 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm48r\" (UniqueName: \"kubernetes.io/projected/7f7c5b44-fd61-434d-b66f-6050b7dfa67c-kube-api-access-hm48r\") pod \"watcher-operator-controller-manager-89776cb76-99p75\" (UID: \"7f7c5b44-fd61-434d-b66f-6050b7dfa67c\") " pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.211894 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.232477 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.232550 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xchqv\" (UniqueName: \"kubernetes.io/projected/b4cb36a7-b850-428e-8c2a-ffd54329db04-kube-api-access-xchqv\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.232654 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgz5l\" (UniqueName: \"kubernetes.io/projected/1993ad74-5fa8-4fda-8f96-774585eba2a7-kube-api-access-tgz5l\") pod \"rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9\" (UID: \"1993ad74-5fa8-4fda-8f96-774585eba2a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.232711 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.233402 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.233440 5200 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.233564 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:22.733524262 +0000 UTC m=+1071.412726080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.233961 5200 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.234071 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:22.734044995 +0000 UTC m=+1071.413246813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "metrics-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.255452 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.264511 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgz5l\" (UniqueName: \"kubernetes.io/projected/1993ad74-5fa8-4fda-8f96-774585eba2a7-kube-api-access-tgz5l\") pod \"rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9\" (UID: \"1993ad74-5fa8-4fda-8f96-774585eba2a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.266179 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchqv\" (UniqueName: \"kubernetes.io/projected/b4cb36a7-b850-428e-8c2a-ffd54329db04-kube-api-access-xchqv\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.334339 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.334671 5200 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.334786 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert podName:8a630dbf-7443-4cdd-b250-73108e8abeba nodeName:}" failed. No retries permitted until 2025-11-26 13:48:23.33476548 +0000 UTC m=+1072.013967298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert") pod "openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" (UID: "8a630dbf-7443-4cdd-b250-73108e8abeba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.365358 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.456554 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.698383 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" event={"ID":"db4bbd86-c902-4e34-bb73-14f2049c52bc","Type":"ContainerStarted","Data":"16748dc768af2ecbc22ee12025ca6d5ec373574168121ba14d9071298ed72f7d"} Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.702496 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" event={"ID":"a7d2842d-c7b4-4d44-9734-b8378d3346a2","Type":"ContainerStarted","Data":"5b95bd0c6bb9dbfe3878d73d1dfec96a79c9033e6678ce578949ee9ca797ad39"} Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.703982 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" event={"ID":"7810ba68-8ec1-4268-99ab-196e7d964387","Type":"ContainerStarted","Data":"157631bb663367460f9878eca7b8822ffb27ca5ade7630ed78cf97bf16cdf87e"} Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.743371 5200 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.743506 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:23.743480931 +0000 UTC m=+1072.422682749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.743199 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.744012 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.744137 5200 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.744203 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. 
No retries permitted until 2025-11-26 13:48:23.744194799 +0000 UTC m=+1072.423396617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "metrics-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.845891 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.846095 5200 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: E1126 13:48:22.846161 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert podName:e5a61fb8-33a8-4763-84ad-5973feee71a1 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:24.846142506 +0000 UTC m=+1073.525344324 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert") pod "infra-operator-controller-manager-9cc5b6bd4-kksm2" (UID: "e5a61fb8-33a8-4763-84ad-5973feee71a1") : secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.882502 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.918546 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w"] Nov 26 13:48:22 crc kubenswrapper[5200]: W1126 13:48:22.924891 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01344b64_a822_408a_b687_698b296cc8db.slice/crio-de0d76d7e0be3757c00440dad6ae5e9061d4463bcc55160c10163ab1eee97b58 WatchSource:0}: Error finding container de0d76d7e0be3757c00440dad6ae5e9061d4463bcc55160c10163ab1eee97b58: Status 404 returned error can't find the container with id de0d76d7e0be3757c00440dad6ae5e9061d4463bcc55160c10163ab1eee97b58 Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.946586 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp"] Nov 26 13:48:22 crc kubenswrapper[5200]: I1126 13:48:22.962487 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.198830 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.243491 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74f4685-bqfwt"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.255864 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-55f4974db-8qmm7"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.262221 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.269466 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv"] Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.278033 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qk56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5bdb4cd4b-d8x72_openstack-operators(3f88a352-bb4b-49cd-9606-de25d1500c22): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.282171 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qk56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5bdb4cd4b-d8x72_openstack-operators(3f88a352-bb4b-49cd-9606-de25d1500c22): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.283389 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" podUID="3f88a352-bb4b-49cd-9606-de25d1500c22" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.283554 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x"] Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.288535 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:89910bc3ecceb7590d3207ac294eb7354de358cf39ef03c72323b26c598e50e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j4xjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-696667c874-6stq7_openstack-operators(d27dbce3-5af6-40a8-960c-4f6965dabcad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.291189 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86v99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7ddc654f5-4nzdn_openstack-operators(7e8a26f8-ad1f-4de6-a3f7-063c7635d25d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc 
kubenswrapper[5200]: E1126 13:48:23.293667 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j4xjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-696667c874-6stq7_openstack-operators(d27dbce3-5af6-40a8-960c-4f6965dabcad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.294307 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-698478755b-twvs8"] Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.294455 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86v99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-7ddc654f5-4nzdn_openstack-operators(7e8a26f8-ad1f-4de6-a3f7-063c7635d25d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.294850 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" podUID="d27dbce3-5af6-40a8-960c-4f6965dabcad" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.295977 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" podUID="7e8a26f8-ad1f-4de6-a3f7-063c7635d25d" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.303486 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-696667c874-6stq7"] Nov 26 13:48:23 crc kubenswrapper[5200]: W1126 13:48:23.316517 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d0e851_53ed_41ce_8730_3c8a1349afd6.slice/crio-0525ab4c7c54405c90c853e3d1c28b4b2505d3b762766e3c5ff8a6f92f87493a WatchSource:0}: Error finding container 0525ab4c7c54405c90c853e3d1c28b4b2505d3b762766e3c5ff8a6f92f87493a: Status 404 returned error can't find the container with id 0525ab4c7c54405c90c853e3d1c28b4b2505d3b762766e3c5ff8a6f92f87493a Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.316850 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mrcq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5c76888db5-cfxkz_openstack-operators(a32a9a76-fc65-489b-9481-73c9f7ff2870): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.320584 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mrcq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5c76888db5-cfxkz_openstack-operators(a32a9a76-fc65-489b-9481-73c9f7ff2870): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.321613 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tnprn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6584fc9d7c-nfb9x_openstack-operators(19d0e851-53ed-41ce-8730-3c8a1349afd6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.321773 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" podUID="a32a9a76-fc65-489b-9481-73c9f7ff2870" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.323336 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz"] Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.324093 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tnprn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6584fc9d7c-nfb9x_openstack-operators(19d0e851-53ed-41ce-8730-3c8a1349afd6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.326205 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" podUID="19d0e851-53ed-41ce-8730-3c8a1349afd6" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.333727 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.341110 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-89776cb76-99p75"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.348280 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.354622 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn"] Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.367858 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.368086 5200 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.368163 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert podName:8a630dbf-7443-4cdd-b250-73108e8abeba nodeName:}" failed. No retries permitted until 2025-11-26 13:48:25.368143169 +0000 UTC m=+1074.047344987 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert") pod "openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" (UID: "8a630dbf-7443-4cdd-b250-73108e8abeba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.722241 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" event={"ID":"1993ad74-5fa8-4fda-8f96-774585eba2a7","Type":"ContainerStarted","Data":"8b965d0940dd7f7ec6860aa5a9852b05a38f708c8899e205b9a3c41e9ba26d06"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.724193 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" event={"ID":"3f88a352-bb4b-49cd-9606-de25d1500c22","Type":"ContainerStarted","Data":"04616749ec9010af4f2bb5a9dadd97fbc35ca64ea295ef0ab52950996d10c395"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.729141 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" event={"ID":"e0557c39-59ea-4478-a6e8-ecaed3a311d5","Type":"ContainerStarted","Data":"4113344b9ca3931ea371641390cec7d097a0f3191c8e09e7c0ac1f8f5ead8e8b"} Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.731074 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" podUID="3f88a352-bb4b-49cd-9606-de25d1500c22" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.732946 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" event={"ID":"58c1e92c-4f66-475a-aa1a-119d8a9aaa88","Type":"ContainerStarted","Data":"170c96ee6c4511ac33181850c60b5e8ffc7bfaff6d905230c21409e1e8ad56c0"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.739183 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" event={"ID":"784e0ea8-032f-4247-87c8-890fb70038f3","Type":"ContainerStarted","Data":"e93f215bd6aa06ea49efe05d0536816146a63cd6f65cf4184e576e80c9edfeba"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.756226 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" event={"ID":"c151e7d7-c8ea-4c75-b933-6ec3d84201fb","Type":"ContainerStarted","Data":"fae964b6ef531484fe2d57aea3aec74a01db93cfeb153f982db018db9344ced0"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.757898 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" event={"ID":"19d0e851-53ed-41ce-8730-3c8a1349afd6","Type":"ContainerStarted","Data":"0525ab4c7c54405c90c853e3d1c28b4b2505d3b762766e3c5ff8a6f92f87493a"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.759086 5200 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" event={"ID":"2276e8f6-42e2-462a-92ea-321b00fb4664","Type":"ContainerStarted","Data":"e8869ec5767c55cdd98f3e38f1f9250862767034e2619700cfa6fda7b82a4758"} Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.760969 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" podUID="19d0e851-53ed-41ce-8730-3c8a1349afd6" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.762050 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" event={"ID":"67d8cd88-e803-4d94-a43e-4e52da2c2baf","Type":"ContainerStarted","Data":"89686b97a4625e5003a15d9f59f37cefcb74b4eb994f9b8a34b0fc31a565dd8f"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.764580 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" event={"ID":"3db58e3f-7ff8-42b2-abee-c0b9f7c3d596","Type":"ContainerStarted","Data":"642101d27c56e146965718d83b89ee9bcc04e83775a1ecbe1efe29cbdd5a2749"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.766510 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" event={"ID":"d27dbce3-5af6-40a8-960c-4f6965dabcad","Type":"ContainerStarted","Data":"9314ad573a26f01f8419328a71fe4fd45fa377e0ac21e788e56c63bb83cefe1f"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.768939 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" event={"ID":"4f0ee79e-a3de-4744-86a9-ca2a6b81675c","Type":"ContainerStarted","Data":"0a0c9671ceeb459bec0e4ca7a6c8738f4085f7b2696bcc791167b791a65887bd"} Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.771415 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:89910bc3ecceb7590d3207ac294eb7354de358cf39ef03c72323b26c598e50e6\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" podUID="d27dbce3-5af6-40a8-960c-4f6965dabcad" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.774115 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" event={"ID":"7e8a26f8-ad1f-4de6-a3f7-063c7635d25d","Type":"ContainerStarted","Data":"2999f374c65c14b547266e4137ed79fa875db644cf73a3c9eb7d0122009b2bcf"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.775913 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.776125 5200 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.777813 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:25.777767763 +0000 UTC m=+1074.456969781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "webhook-server-cert" not found Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.776339 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.777904 5200 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.777972 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:25.777949278 +0000 UTC m=+1074.457151096 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "metrics-server-cert" not found Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.778302 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" podUID="7e8a26f8-ad1f-4de6-a3f7-063c7635d25d" Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.792878 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" event={"ID":"a32a9a76-fc65-489b-9481-73c9f7ff2870","Type":"ContainerStarted","Data":"652b554216342f82c6fd6789bd8b374aca849c4a3d7f5cac47ec5cc11cfbfac8"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.797568 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" event={"ID":"7f7c5b44-fd61-434d-b66f-6050b7dfa67c","Type":"ContainerStarted","Data":"1c12005360c207a9c75f841f2be250f97dccf908c6f8b4cf193d74f098f96914"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.799468 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" event={"ID":"8870e7ff-5b48-4a97-a71b-659c3ad81431","Type":"ContainerStarted","Data":"564178c7394d9d882ade311d656b5ec2b9040a69eb7186b0fb33a3db6620621c"} Nov 26 13:48:23 crc kubenswrapper[5200]: I1126 13:48:23.800253 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" event={"ID":"01344b64-a822-408a-b687-698b296cc8db","Type":"ContainerStarted","Data":"de0d76d7e0be3757c00440dad6ae5e9061d4463bcc55160c10163ab1eee97b58"} Nov 26 13:48:23 crc kubenswrapper[5200]: E1126 13:48:23.807456 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" podUID="a32a9a76-fc65-489b-9481-73c9f7ff2870" Nov 26 13:48:24 crc kubenswrapper[5200]: E1126 13:48:24.817408 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" podUID="a32a9a76-fc65-489b-9481-73c9f7ff2870" Nov 26 13:48:24 crc kubenswrapper[5200]: E1126 13:48:24.817905 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:89910bc3ecceb7590d3207ac294eb7354de358cf39ef03c72323b26c598e50e6\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" podUID="d27dbce3-5af6-40a8-960c-4f6965dabcad" Nov 26 13:48:24 crc kubenswrapper[5200]: E1126 13:48:24.818121 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" podUID="7e8a26f8-ad1f-4de6-a3f7-063c7635d25d" Nov 26 13:48:24 crc kubenswrapper[5200]: E1126 13:48:24.820005 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" podUID="19d0e851-53ed-41ce-8730-3c8a1349afd6" Nov 26 13:48:24 crc kubenswrapper[5200]: E1126 13:48:24.827578 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" podUID="3f88a352-bb4b-49cd-9606-de25d1500c22" Nov 26 13:48:24 crc kubenswrapper[5200]: I1126 13:48:24.901984 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:24 crc 
kubenswrapper[5200]: E1126 13:48:24.902923 5200 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:24 crc kubenswrapper[5200]: E1126 13:48:24.903009 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert podName:e5a61fb8-33a8-4763-84ad-5973feee71a1 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:28.902987775 +0000 UTC m=+1077.582189593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert") pod "infra-operator-controller-manager-9cc5b6bd4-kksm2" (UID: "e5a61fb8-33a8-4763-84ad-5973feee71a1") : secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:25 crc kubenswrapper[5200]: I1126 13:48:25.418623 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:25 crc kubenswrapper[5200]: E1126 13:48:25.418805 5200 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:25 crc kubenswrapper[5200]: E1126 13:48:25.418883 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert podName:8a630dbf-7443-4cdd-b250-73108e8abeba nodeName:}" failed. No retries permitted until 2025-11-26 13:48:29.418863208 +0000 UTC m=+1078.098065026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert") pod "openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" (UID: "8a630dbf-7443-4cdd-b250-73108e8abeba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:25 crc kubenswrapper[5200]: I1126 13:48:25.826184 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:25 crc kubenswrapper[5200]: I1126 13:48:25.826262 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:25 crc kubenswrapper[5200]: E1126 13:48:25.826454 5200 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 13:48:25 crc kubenswrapper[5200]: E1126 13:48:25.826520 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. 
No retries permitted until 2025-11-26 13:48:29.82649973 +0000 UTC m=+1078.505701548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "metrics-server-cert" not found Nov 26 13:48:25 crc kubenswrapper[5200]: E1126 13:48:25.826921 5200 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 13:48:25 crc kubenswrapper[5200]: E1126 13:48:25.826950 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:29.826942062 +0000 UTC m=+1078.506143880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "webhook-server-cert" not found Nov 26 13:48:28 crc kubenswrapper[5200]: I1126 13:48:28.921507 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:28 crc kubenswrapper[5200]: E1126 13:48:28.921704 5200 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:28 crc kubenswrapper[5200]: E1126 13:48:28.922545 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert podName:e5a61fb8-33a8-4763-84ad-5973feee71a1 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:36.922521588 +0000 UTC m=+1085.601723406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert") pod "infra-operator-controller-manager-9cc5b6bd4-kksm2" (UID: "e5a61fb8-33a8-4763-84ad-5973feee71a1") : secret "infra-operator-webhook-server-cert" not found Nov 26 13:48:29 crc kubenswrapper[5200]: I1126 13:48:29.432238 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:29 crc kubenswrapper[5200]: E1126 13:48:29.432749 5200 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:29 crc kubenswrapper[5200]: E1126 13:48:29.432885 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert podName:8a630dbf-7443-4cdd-b250-73108e8abeba nodeName:}" failed. 
No retries permitted until 2025-11-26 13:48:37.432849607 +0000 UTC m=+1086.112051465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert") pod "openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" (UID: "8a630dbf-7443-4cdd-b250-73108e8abeba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 13:48:29 crc kubenswrapper[5200]: I1126 13:48:29.839845 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:29 crc kubenswrapper[5200]: I1126 13:48:29.839932 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:29 crc kubenswrapper[5200]: E1126 13:48:29.840015 5200 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 13:48:29 crc kubenswrapper[5200]: E1126 13:48:29.840023 5200 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 13:48:29 crc kubenswrapper[5200]: E1126 13:48:29.840109 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:37.840089119 +0000 UTC m=+1086.519290937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "webhook-server-cert" not found Nov 26 13:48:29 crc kubenswrapper[5200]: E1126 13:48:29.840138 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs podName:b4cb36a7-b850-428e-8c2a-ffd54329db04 nodeName:}" failed. No retries permitted until 2025-11-26 13:48:37.84012802 +0000 UTC m=+1086.519329838 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs") pod "openstack-operator-controller-manager-68d9dc8f85-vzjn8" (UID: "b4cb36a7-b850-428e-8c2a-ffd54329db04") : secret "metrics-server-cert" not found Nov 26 13:48:36 crc kubenswrapper[5200]: I1126 13:48:36.979506 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:36 crc kubenswrapper[5200]: I1126 13:48:36.990817 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a61fb8-33a8-4763-84ad-5973feee71a1-cert\") pod \"infra-operator-controller-manager-9cc5b6bd4-kksm2\" (UID: \"e5a61fb8-33a8-4763-84ad-5973feee71a1\") " pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.150644 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.491246 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.499918 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a630dbf-7443-4cdd-b250-73108e8abeba-cert\") pod \"openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv\" (UID: \"8a630dbf-7443-4cdd-b250-73108e8abeba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.654349 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.911328 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.911430 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.932720 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-metrics-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.936807 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4cb36a7-b850-428e-8c2a-ffd54329db04-webhook-certs\") pod \"openstack-operator-controller-manager-68d9dc8f85-vzjn8\" (UID: \"b4cb36a7-b850-428e-8c2a-ffd54329db04\") " pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.950920 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.967407 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2"] Nov 26 13:48:37 crc kubenswrapper[5200]: I1126 13:48:37.975312 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" event={"ID":"67d8cd88-e803-4d94-a43e-4e52da2c2baf","Type":"ContainerStarted","Data":"e62a3e16b87cfb748ada83fa4b646dc12607ff12cf57cd205fd9fe1dc824c354"} Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.013721 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" event={"ID":"8870e7ff-5b48-4a97-a71b-659c3ad81431","Type":"ContainerStarted","Data":"8d7d27a88d16692cf6739cd669caf6087598f54bc56be67c2aef501e27cd3c3e"} Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.025533 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" event={"ID":"1993ad74-5fa8-4fda-8f96-774585eba2a7","Type":"ContainerStarted","Data":"f4f1e07c67ae932f59683cc21568f3c0d817c27d1a738060ba2aff59f32c7e74"} Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.043418 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" event={"ID":"7810ba68-8ec1-4268-99ab-196e7d964387","Type":"ContainerStarted","Data":"02df556f4000a2eace64d8c866a0db5d4956c72833bb9f2a6903263c7c1d57de"} Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.043905 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" event={"ID":"58c1e92c-4f66-475a-aa1a-119d8a9aaa88","Type":"ContainerStarted","Data":"9c04e2183459b5228a607b949138094e5d591b0c2c0007fce71227af109b0d16"} Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.051134 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9" podStartSLOduration=3.026403242 podStartE2EDuration="17.051110227s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.272970559 +0000 UTC m=+1071.952172377" lastFinishedPulling="2025-11-26 13:48:37.297677534 +0000 UTC m=+1085.976879362" observedRunningTime="2025-11-26 13:48:38.043816137 +0000 UTC m=+1086.723017955" watchObservedRunningTime="2025-11-26 13:48:38.051110227 +0000 UTC m=+1086.730312045" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.155465 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fxjpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-79976d8764-8xs2k_openstack-operators(db4bbd86-c902-4e34-bb73-14f2049c52bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.156710 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" podUID="db4bbd86-c902-4e34-bb73-14f2049c52bc" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.192357 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjmqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-57ffb97849-48vc9_openstack-operators(a7d2842d-c7b4-4d44-9734-b8378d3346a2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.193667 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" podUID="a7d2842d-c7b4-4d44-9734-b8378d3346a2" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.196130 5200 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hm48r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-89776cb76-99p75_openstack-operators(7f7c5b44-fd61-434d-b66f-6050b7dfa67c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.198083 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" podUID="7f7c5b44-fd61-434d-b66f-6050b7dfa67c" Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.230069 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv"] Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.233329 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5nfz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-784656dc9c-9mbvq_openstack-operators(3db58e3f-7ff8-42b2-abee-c0b9f7c3d596): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.235433 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" podUID="3db58e3f-7ff8-42b2-abee-c0b9f7c3d596" Nov 26 13:48:38 crc kubenswrapper[5200]: W1126 13:48:38.250522 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a630dbf_7443_4cdd_b250_73108e8abeba.slice/crio-2bc8ce82dfa7af111fe6e1207026839075f31184b72596f61def847a42852c19 WatchSource:0}: Error finding container 2bc8ce82dfa7af111fe6e1207026839075f31184b72596f61def847a42852c19: Status 404 returned error can't find the container with id 2bc8ce82dfa7af111fe6e1207026839075f31184b72596f61def847a42852c19 Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.488403 5200 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-79g5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-7dd76969c9-ldrsq_openstack-operators(4f0ee79e-a3de-4744-86a9-ca2a6b81675c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 13:48:38 crc kubenswrapper[5200]: E1126 13:48:38.490016 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" podUID="4f0ee79e-a3de-4744-86a9-ca2a6b81675c" Nov 26 13:48:38 crc kubenswrapper[5200]: I1126 13:48:38.901513 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8"] Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.147805 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" event={"ID":"db4bbd86-c902-4e34-bb73-14f2049c52bc","Type":"ContainerStarted","Data":"859da82896feee4963a510d2d358ed7c90eba22d69dce0ee6a1d298ae59760c2"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.147880 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:39 crc kubenswrapper[5200]: E1126 13:48:39.167007 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" podUID="db4bbd86-c902-4e34-bb73-14f2049c52bc" Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.172694 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" event={"ID":"b4cb36a7-b850-428e-8c2a-ffd54329db04","Type":"ContainerStarted","Data":"ef5bc01bf052db2b330de01fd4ed2ce21c0acfe25c720cfe5635acfee6134745"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.185535 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" event={"ID":"7f7c5b44-fd61-434d-b66f-6050b7dfa67c","Type":"ContainerStarted","Data":"367b335651769a4f9fcecb3c3ece8ea5f9ddf9425c4827400ed68e052e8e6767"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.185725 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:39 crc kubenswrapper[5200]: E1126 13:48:39.192253 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" podUID="7f7c5b44-fd61-434d-b66f-6050b7dfa67c" Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.192874 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" event={"ID":"01344b64-a822-408a-b687-698b296cc8db","Type":"ContainerStarted","Data":"bcf6f13eb675018d34ace647be9da9ff10ce11136d7c668cdaebad4102194b84"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.208916 5200 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" event={"ID":"e0557c39-59ea-4478-a6e8-ecaed3a311d5","Type":"ContainerStarted","Data":"0389d980b5eefaa572de2e4a7b8fbc4d7e6d037661a8503da0f3d9eac9763af2"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.230916 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" event={"ID":"784e0ea8-032f-4247-87c8-890fb70038f3","Type":"ContainerStarted","Data":"6fed03724c00bdb603faa006dd0e702cd822a56845b621cc8a5cc628ea8d8e03"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.243356 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" event={"ID":"a7d2842d-c7b4-4d44-9734-b8378d3346a2","Type":"ContainerStarted","Data":"8e6a9eab4d2f949663f0c11fe17b021fab5f451dd3e5ab7228e7b10fd4c37be6"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.244292 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:39 crc kubenswrapper[5200]: E1126 13:48:39.249939 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" podUID="a7d2842d-c7b4-4d44-9734-b8378d3346a2" Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.267365 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" event={"ID":"c151e7d7-c8ea-4c75-b933-6ec3d84201fb","Type":"ContainerStarted","Data":"ab7dcd68095af9f000f2b4906f3641ccef33fe353a0e4071859b418b8fd1e4f8"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.285826 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" event={"ID":"2276e8f6-42e2-462a-92ea-321b00fb4664","Type":"ContainerStarted","Data":"be216d3dce5759a86b5ff66ca068483833704c0ac682f1d24cc3586207cb0426"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.302906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" event={"ID":"3db58e3f-7ff8-42b2-abee-c0b9f7c3d596","Type":"ContainerStarted","Data":"b0b51ff60e14e05b3d1ef48006980d3fcd2a48fb876c964ce9f0cb4113d98f47"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.303398 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:39 crc kubenswrapper[5200]: E1126 13:48:39.312157 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" podUID="3db58e3f-7ff8-42b2-abee-c0b9f7c3d596" Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.322764 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" 
event={"ID":"4f0ee79e-a3de-4744-86a9-ca2a6b81675c","Type":"ContainerStarted","Data":"45c4c35dcdaa10ad55f794fddf98902a2b9e55a6a40f1f5c7e93c13b79e9cb31"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.323647 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:39 crc kubenswrapper[5200]: E1126 13:48:39.328948 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" podUID="4f0ee79e-a3de-4744-86a9-ca2a6b81675c" Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.353589 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" event={"ID":"8a630dbf-7443-4cdd-b250-73108e8abeba","Type":"ContainerStarted","Data":"2bc8ce82dfa7af111fe6e1207026839075f31184b72596f61def847a42852c19"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.355824 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" event={"ID":"e5a61fb8-33a8-4763-84ad-5973feee71a1","Type":"ContainerStarted","Data":"8d5d7559f24ac93943a18ef27e56fe0dd1ef93ec48c00eee469eee0ddfbfa260"} Nov 26 13:48:39 crc kubenswrapper[5200]: I1126 13:48:39.368066 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" event={"ID":"7e8a26f8-ad1f-4de6-a3f7-063c7635d25d","Type":"ContainerStarted","Data":"4945e977784533d51bfadb7b5785734132dbc436595045fb59018fb5a67178c8"} Nov 26 13:48:40 crc kubenswrapper[5200]: I1126 13:48:40.389460 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" event={"ID":"b4cb36a7-b850-428e-8c2a-ffd54329db04","Type":"ContainerStarted","Data":"fb31cb51421ac2cb2e66081bda9e6b0cf98d9d0887e83ec2ae44bc54865b784d"} Nov 26 13:48:40 crc kubenswrapper[5200]: I1126 13:48:40.392274 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:40 crc kubenswrapper[5200]: E1126 13:48:40.400369 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" podUID="3db58e3f-7ff8-42b2-abee-c0b9f7c3d596" Nov 26 13:48:40 crc kubenswrapper[5200]: E1126 13:48:40.400505 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" podUID="4f0ee79e-a3de-4744-86a9-ca2a6b81675c" Nov 26 13:48:40 crc kubenswrapper[5200]: E1126 13:48:40.404204 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" podUID="7f7c5b44-fd61-434d-b66f-6050b7dfa67c" Nov 26 13:48:40 crc kubenswrapper[5200]: E1126 13:48:40.404466 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" podUID="a7d2842d-c7b4-4d44-9734-b8378d3346a2" Nov 26 13:48:40 crc kubenswrapper[5200]: E1126 13:48:40.404678 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" podUID="db4bbd86-c902-4e34-bb73-14f2049c52bc" Nov 26 13:48:40 crc kubenswrapper[5200]: I1126 13:48:40.429370 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" podStartSLOduration=19.429346721 podStartE2EDuration="19.429346721s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:48:40.420795158 +0000 UTC m=+1089.099996996" watchObservedRunningTime="2025-11-26 13:48:40.429346721 +0000 UTC m=+1089.108548549" Nov 26 13:48:42 crc kubenswrapper[5200]: E1126 13:48:42.142857 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167834 actualBytes=10240 Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.395102 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.399210 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.399253 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.399286 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.399314 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.517869 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" event={"ID":"8a630dbf-7443-4cdd-b250-73108e8abeba","Type":"ContainerStarted","Data":"c59c95d66a5d2c155fa87d99d5bf5b83d69d9e481217c669204c5192095cd7de"} Nov 26 13:48:50 crc kubenswrapper[5200]: I1126 13:48:50.520172 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" event={"ID":"3f88a352-bb4b-49cd-9606-de25d1500c22","Type":"ContainerStarted","Data":"bdfb3ddf49c03a8fa525483c9ab21bb37d81b6146c3a40e90fb1597aaf50ea1f"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.537559 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" event={"ID":"8870e7ff-5b48-4a97-a71b-659c3ad81431","Type":"ContainerStarted","Data":"19fccd539285e4c1802a24dc217ec1a614f3e88cfb84cba1664b0c5a673919ba"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.540772 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.548152 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.554616 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" event={"ID":"01344b64-a822-408a-b687-698b296cc8db","Type":"ContainerStarted","Data":"1fadfdf0c1f7220ad044db63b74122f41e0d0fba316317b08195ecdd251367ea"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.555864 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.561607 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.572203 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" event={"ID":"7810ba68-8ec1-4268-99ab-196e7d964387","Type":"ContainerStarted","Data":"89cba5e1d03c34b2effe4fd946e932c4031a0380f1af7c57eaf1d4131c0484de"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.573979 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.574881 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.594226 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-698478755b-twvs8" podStartSLOduration=3.842440651 podStartE2EDuration="30.594207113s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.259116218 +0000 UTC m=+1071.938318036" lastFinishedPulling="2025-11-26 13:48:50.01088267 +0000 UTC m=+1098.690084498" observedRunningTime="2025-11-26 13:48:51.575052168 +0000 UTC m=+1100.254253996" watchObservedRunningTime="2025-11-26 13:48:51.594207113 +0000 UTC m=+1100.273408931" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.594551 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" 
event={"ID":"e0557c39-59ea-4478-a6e8-ecaed3a311d5","Type":"ContainerStarted","Data":"e85eda9a2700b34292f5f824ea94509452c7d0da368af02560247687b1c3be83"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.594811 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.601029 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.607734 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" event={"ID":"58c1e92c-4f66-475a-aa1a-119d8a9aaa88","Type":"ContainerStarted","Data":"a656818273474a842f53a7a670a26002c30439e734b33185665760006e5767cd"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.612236 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.624774 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.649414 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7d5b8c9b49-lsh9m" podStartSLOduration=4.149120938 podStartE2EDuration="31.649392751s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:22.498572849 +0000 UTC m=+1071.177774667" lastFinishedPulling="2025-11-26 13:48:49.998844622 +0000 UTC m=+1098.678046480" observedRunningTime="2025-11-26 13:48:51.618594847 +0000 UTC m=+1100.297796665" watchObservedRunningTime="2025-11-26 13:48:51.649392751 +0000 UTC m=+1100.328594569" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.653435 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc6b587b-lkn4w" podStartSLOduration=4.579000278 podStartE2EDuration="31.653422717s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:22.932402264 +0000 UTC m=+1071.611604082" lastFinishedPulling="2025-11-26 13:48:50.006824693 +0000 UTC m=+1098.686026521" observedRunningTime="2025-11-26 13:48:51.649114513 +0000 UTC m=+1100.328316351" watchObservedRunningTime="2025-11-26 13:48:51.653422717 +0000 UTC m=+1100.332624535" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.658897 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" event={"ID":"784e0ea8-032f-4247-87c8-890fb70038f3","Type":"ContainerStarted","Data":"e75a2bc065e96952d19ed3f0232caf0a0f3b7b026a15887b40d279c970f9831e"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.660358 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.662979 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.701169 5200 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" event={"ID":"a7d2842d-c7b4-4d44-9734-b8378d3346a2","Type":"ContainerStarted","Data":"b38f322f37262274aad585de608e942107891d7dff0cb142cad255170bf480d1"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.733783 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" event={"ID":"c151e7d7-c8ea-4c75-b933-6ec3d84201fb","Type":"ContainerStarted","Data":"24499e3465076fd851dce5abd33c7c2512b8556960f79136510626ac4adb7db1"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.734274 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6dbbbff649-m9lc9" podStartSLOduration=4.902461144 podStartE2EDuration="31.734245732s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.180105779 +0000 UTC m=+1071.859307597" lastFinishedPulling="2025-11-26 13:48:50.011890327 +0000 UTC m=+1098.691092185" observedRunningTime="2025-11-26 13:48:51.733220055 +0000 UTC m=+1100.412421883" watchObservedRunningTime="2025-11-26 13:48:51.734245732 +0000 UTC m=+1100.413447550" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.735239 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.746057 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.756782 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" event={"ID":"19d0e851-53ed-41ce-8730-3c8a1349afd6","Type":"ContainerStarted","Data":"daa9cb3e1d3161d644a5265331716922f6aa21010fe38082a2cd9bb52bd5fbfa"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.757680 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.764861 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" event={"ID":"2276e8f6-42e2-462a-92ea-321b00fb4664","Type":"ContainerStarted","Data":"6c3d84c869a8445961df33a3c38703b6e96f3e040b1f45372f62850b08d1b998"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.766239 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.779383 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-664984dfdb-9llqp" podStartSLOduration=4.7057156540000005 podStartE2EDuration="31.779370593s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:22.933579465 +0000 UTC m=+1071.612781283" lastFinishedPulling="2025-11-26 13:48:50.007234384 +0000 UTC m=+1098.686436222" observedRunningTime="2025-11-26 13:48:51.777633677 +0000 UTC m=+1100.456835505" watchObservedRunningTime="2025-11-26 13:48:51.779370593 +0000 UTC m=+1100.458572411" Nov 26 13:48:51 crc kubenswrapper[5200]: 
I1126 13:48:51.795831 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" event={"ID":"67d8cd88-e803-4d94-a43e-4e52da2c2baf","Type":"ContainerStarted","Data":"1cc84b5ed8b400fb9279522403129a289f0ec397e7180c83e9c1403d1e55c394"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.796934 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.810328 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.824026 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" event={"ID":"3db58e3f-7ff8-42b2-abee-c0b9f7c3d596","Type":"ContainerStarted","Data":"0d274dc9fcb10852d6bca5430d5a0c4a452edefe2abf618c4d8fee4a34096352"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.832011 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.857412 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" event={"ID":"d27dbce3-5af6-40a8-960c-4f6965dabcad","Type":"ContainerStarted","Data":"8c68219feb69c985391ad7a25ab412133e59d0a692372999ae68fd64c31a1736"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.858193 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.871932 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-565f56787b-szbjv" podStartSLOduration=4.118253755 podStartE2EDuration="30.871915007s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.25727744 +0000 UTC m=+1071.936479258" lastFinishedPulling="2025-11-26 13:48:50.010938652 +0000 UTC m=+1098.690140510" observedRunningTime="2025-11-26 13:48:51.870081009 +0000 UTC m=+1100.549282837" watchObservedRunningTime="2025-11-26 13:48:51.871915007 +0000 UTC m=+1100.551116815" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.886569 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74f4685-bqfwt" podStartSLOduration=5.143660628 podStartE2EDuration="31.886549154s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.253123902 +0000 UTC m=+1071.932325720" lastFinishedPulling="2025-11-26 13:48:49.996012388 +0000 UTC m=+1098.675214246" observedRunningTime="2025-11-26 13:48:51.830566815 +0000 UTC m=+1100.509768633" watchObservedRunningTime="2025-11-26 13:48:51.886549154 +0000 UTC m=+1100.565750972" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.895169 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" event={"ID":"4f0ee79e-a3de-4744-86a9-ca2a6b81675c","Type":"ContainerStarted","Data":"eb557fe59aacd0ea835beb516149c8bb216d1a5a406b29d7920dda4ec1315f3b"} Nov 26 13:48:51 crc 
kubenswrapper[5200]: I1126 13:48:51.911658 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55f4974db-8qmm7" podStartSLOduration=4.018082186 podStartE2EDuration="30.911639946s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.23963978 +0000 UTC m=+1071.918841598" lastFinishedPulling="2025-11-26 13:48:50.13319752 +0000 UTC m=+1098.812399358" observedRunningTime="2025-11-26 13:48:51.910939988 +0000 UTC m=+1100.590141816" watchObservedRunningTime="2025-11-26 13:48:51.911639946 +0000 UTC m=+1100.590841764" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.936355 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" event={"ID":"8a630dbf-7443-4cdd-b250-73108e8abeba","Type":"ContainerStarted","Data":"82bfb35d5925092495401e82daa9e8d24c9ef74b6b0e88a07a7b3c66a45b72f7"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.937126 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.951210 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" event={"ID":"e5a61fb8-33a8-4763-84ad-5973feee71a1","Type":"ContainerStarted","Data":"f1169e3273d1f1286ef63c61975051509641eb5493a3ab9204ae37af89653f0e"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.957904 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.962663 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-57ffb97849-48vc9" podStartSLOduration=17.098201195 podStartE2EDuration="31.960806265s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:21.763014311 +0000 UTC m=+1070.442216129" lastFinishedPulling="2025-11-26 13:48:36.625619381 +0000 UTC m=+1085.304821199" observedRunningTime="2025-11-26 13:48:51.943934769 +0000 UTC m=+1100.623136597" watchObservedRunningTime="2025-11-26 13:48:51.960806265 +0000 UTC m=+1100.640008083" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.971993 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77c8d6d47b-mv8dt" podStartSLOduration=4.766061748 podStartE2EDuration="31.97197305s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:22.904600109 +0000 UTC m=+1071.583801917" lastFinishedPulling="2025-11-26 13:48:50.110511381 +0000 UTC m=+1098.789713219" observedRunningTime="2025-11-26 13:48:51.969271798 +0000 UTC m=+1100.648473616" watchObservedRunningTime="2025-11-26 13:48:51.97197305 +0000 UTC m=+1100.651174868" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.981537 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" event={"ID":"7e8a26f8-ad1f-4de6-a3f7-063c7635d25d","Type":"ContainerStarted","Data":"7256b5e14f916289cf514b92946fbb1175d9c342bcbcffc37bc059f1a4224d71"} Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.982702 5200 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:51 crc kubenswrapper[5200]: I1126 13:48:51.995994 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.001339 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-784656dc9c-9mbvq" podStartSLOduration=18.017353241 podStartE2EDuration="32.001310864s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.309676255 +0000 UTC m=+1071.988878073" lastFinishedPulling="2025-11-26 13:48:37.293633878 +0000 UTC m=+1085.972835696" observedRunningTime="2025-11-26 13:48:51.996330723 +0000 UTC m=+1100.675532541" watchObservedRunningTime="2025-11-26 13:48:52.001310864 +0000 UTC m=+1100.680512682" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.018222 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" event={"ID":"a32a9a76-fc65-489b-9481-73c9f7ff2870","Type":"ContainerStarted","Data":"c59702cafd9c8217767be30119624e92a3dad3e1f0c72188b811736c3aa8d949"} Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.019080 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.048877 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" event={"ID":"db4bbd86-c902-4e34-bb73-14f2049c52bc","Type":"ContainerStarted","Data":"a7530095569660491969796f8fa5634c6fe6d5fc97af8495d09b435b59cdcbea"} Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.059996 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" podStartSLOduration=5.474892742 podStartE2EDuration="32.059974574s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.321504254 +0000 UTC m=+1072.000706062" lastFinishedPulling="2025-11-26 13:48:49.906586066 +0000 UTC m=+1098.585787894" observedRunningTime="2025-11-26 13:48:52.058175476 +0000 UTC m=+1100.737377304" watchObservedRunningTime="2025-11-26 13:48:52.059974574 +0000 UTC m=+1100.739176392" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.105553 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79976d8764-8xs2k" podStartSLOduration=17.450752033 podStartE2EDuration="32.105530437s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:22.007556833 +0000 UTC m=+1070.686758651" lastFinishedPulling="2025-11-26 13:48:36.662335237 +0000 UTC m=+1085.341537055" observedRunningTime="2025-11-26 13:48:52.10339574 +0000 UTC m=+1100.782597558" watchObservedRunningTime="2025-11-26 13:48:52.105530437 +0000 UTC m=+1100.784732255" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.267726 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7dd76969c9-ldrsq" podStartSLOduration=17.905443238 podStartE2EDuration="32.267696409s" 
podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:22.932411024 +0000 UTC m=+1071.611612842" lastFinishedPulling="2025-11-26 13:48:37.294664175 +0000 UTC m=+1085.973866013" observedRunningTime="2025-11-26 13:48:52.164102263 +0000 UTC m=+1100.843304081" watchObservedRunningTime="2025-11-26 13:48:52.267696409 +0000 UTC m=+1100.946898227" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.268259 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" podStartSLOduration=5.64973279 podStartE2EDuration="32.268253744s" podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.288281068 +0000 UTC m=+1071.967482886" lastFinishedPulling="2025-11-26 13:48:49.906801982 +0000 UTC m=+1098.586003840" observedRunningTime="2025-11-26 13:48:52.205916928 +0000 UTC m=+1100.885118766" watchObservedRunningTime="2025-11-26 13:48:52.268253744 +0000 UTC m=+1100.947455552" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.271383 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7ddc654f5-4nzdn" podStartSLOduration=4.655558312 podStartE2EDuration="31.271374276s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.291024549 +0000 UTC m=+1071.970226367" lastFinishedPulling="2025-11-26 13:48:49.906840473 +0000 UTC m=+1098.586042331" observedRunningTime="2025-11-26 13:48:52.230657401 +0000 UTC m=+1100.909859219" watchObservedRunningTime="2025-11-26 13:48:52.271374276 +0000 UTC m=+1100.950576094" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.295846 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" podStartSLOduration=24.617964386 podStartE2EDuration="31.295821632s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:38.276258365 +0000 UTC m=+1086.955460183" lastFinishedPulling="2025-11-26 13:48:44.954115571 +0000 UTC m=+1093.633317429" observedRunningTime="2025-11-26 13:48:52.274656033 +0000 UTC m=+1100.953857851" watchObservedRunningTime="2025-11-26 13:48:52.295821632 +0000 UTC m=+1100.975023450" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.319641 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" podStartSLOduration=4.72966513 podStartE2EDuration="31.31961692s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.316697208 +0000 UTC m=+1071.995899026" lastFinishedPulling="2025-11-26 13:48:49.906648998 +0000 UTC m=+1098.585850816" observedRunningTime="2025-11-26 13:48:52.315958994 +0000 UTC m=+1100.995160822" watchObservedRunningTime="2025-11-26 13:48:52.31961692 +0000 UTC m=+1100.998818728" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.419719 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-68d9dc8f85-vzjn8" Nov 26 13:48:52 crc kubenswrapper[5200]: I1126 13:48:52.446982 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" podStartSLOduration=20.496769463 podStartE2EDuration="32.446954143s" 
podCreationTimestamp="2025-11-26 13:48:20 +0000 UTC" firstStartedPulling="2025-11-26 13:48:38.022933853 +0000 UTC m=+1086.702135671" lastFinishedPulling="2025-11-26 13:48:49.973118503 +0000 UTC m=+1098.652320351" observedRunningTime="2025-11-26 13:48:52.353135936 +0000 UTC m=+1101.032337754" watchObservedRunningTime="2025-11-26 13:48:52.446954143 +0000 UTC m=+1101.126155961" Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.062207 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" event={"ID":"7f7c5b44-fd61-434d-b66f-6050b7dfa67c","Type":"ContainerStarted","Data":"c18c922dd5f40bfa5a0c0be6c3438e3056bcbc352f56b1c2ed1ab29e4eb7cdbc"} Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.066961 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" event={"ID":"3f88a352-bb4b-49cd-9606-de25d1500c22","Type":"ContainerStarted","Data":"504a3b3e952e8cf4f68caa6b9910c0a8ee7fc19ea4933a9543dc0a90d4bb735a"} Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.067935 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.071274 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" event={"ID":"19d0e851-53ed-41ce-8730-3c8a1349afd6","Type":"ContainerStarted","Data":"1093a2d74daca29f45b00275a0d1b5cf9c7e013365c6d2335a4f1da863cab5b3"} Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.075728 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" event={"ID":"d27dbce3-5af6-40a8-960c-4f6965dabcad","Type":"ContainerStarted","Data":"8afb91a2d10c8a3ec3e3bed4cfb3494d21d9cde8aec52d7f1c66ae0737d47fd1"} Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.079610 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" event={"ID":"e5a61fb8-33a8-4763-84ad-5973feee71a1","Type":"ContainerStarted","Data":"438f49446dc21bf27cce0514fff5d2db4563e11f916064dcd16f0526f2309286"} Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.082958 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" event={"ID":"a32a9a76-fc65-489b-9481-73c9f7ff2870","Type":"ContainerStarted","Data":"fa93a19ccc7c1b5e684f2946683da96139535f94c6935f2016887c48eaa271cc"} Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.094252 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-89776cb76-99p75" podStartSLOduration=18.70041051 podStartE2EDuration="32.094219105s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.277319762 +0000 UTC m=+1071.956521580" lastFinishedPulling="2025-11-26 13:48:36.671128327 +0000 UTC m=+1085.350330175" observedRunningTime="2025-11-26 13:48:53.090064526 +0000 UTC m=+1101.769266374" watchObservedRunningTime="2025-11-26 13:48:53.094219105 +0000 UTC m=+1101.773420933" Nov 26 13:48:53 crc kubenswrapper[5200]: I1126 13:48:53.130041 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" podStartSLOduration=5.5008732 podStartE2EDuration="32.130011461s" podCreationTimestamp="2025-11-26 13:48:21 +0000 UTC" firstStartedPulling="2025-11-26 13:48:23.277885437 +0000 UTC m=+1071.957087255" lastFinishedPulling="2025-11-26 13:48:49.907023698 +0000 UTC m=+1098.586225516" observedRunningTime="2025-11-26 13:48:53.12656523 +0000 UTC m=+1101.805767058" watchObservedRunningTime="2025-11-26 13:48:53.130011461 +0000 UTC m=+1101.809213319" Nov 26 13:49:04 crc kubenswrapper[5200]: I1126 13:49:04.099814 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-696667c874-6stq7" Nov 26 13:49:04 crc kubenswrapper[5200]: I1126 13:49:04.102970 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5bdb4cd4b-d8x72" Nov 26 13:49:04 crc kubenswrapper[5200]: I1126 13:49:04.106121 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv" Nov 26 13:49:04 crc kubenswrapper[5200]: I1126 13:49:04.111563 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9cc5b6bd4-kksm2" Nov 26 13:49:04 crc kubenswrapper[5200]: I1126 13:49:04.112109 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6584fc9d7c-nfb9x" Nov 26 13:49:04 crc kubenswrapper[5200]: I1126 13:49:04.112513 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5c76888db5-cfxkz" Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.927004 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-x97v9"] Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.941519 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.945722 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dnsmasq-dns-dockercfg-vm792\"" Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.946231 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openshift-service-ca.crt\"" Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.946239 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns\"" Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.948981 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"kube-root-ca.crt\"" Nov 26 13:49:20 crc kubenswrapper[5200]: I1126 13:49:20.950608 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-x97v9"] Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.017018 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/912eca56-ca63-4646-8546-98708ceaee05-config\") pod \"dnsmasq-dns-5dbf879849-x97v9\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.017084 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8s5q\" (UniqueName: \"kubernetes.io/projected/912eca56-ca63-4646-8546-98708ceaee05-kube-api-access-f8s5q\") pod \"dnsmasq-dns-5dbf879849-x97v9\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.041291 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-6khkp"] Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.056158 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.058277 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-6khkp"] Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.061989 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns-svc\"" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.120848 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q6m2\" (UniqueName: \"kubernetes.io/projected/8f127dad-fe57-4c5f-9029-b884f62ea84b-kube-api-access-8q6m2\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.120945 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/912eca56-ca63-4646-8546-98708ceaee05-config\") pod \"dnsmasq-dns-5dbf879849-x97v9\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.121020 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8s5q\" (UniqueName: \"kubernetes.io/projected/912eca56-ca63-4646-8546-98708ceaee05-kube-api-access-f8s5q\") pod \"dnsmasq-dns-5dbf879849-x97v9\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.121051 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-dns-svc\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.121136 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-config\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.123709 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/912eca56-ca63-4646-8546-98708ceaee05-config\") pod \"dnsmasq-dns-5dbf879849-x97v9\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.163805 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8s5q\" (UniqueName: \"kubernetes.io/projected/912eca56-ca63-4646-8546-98708ceaee05-kube-api-access-f8s5q\") pod \"dnsmasq-dns-5dbf879849-x97v9\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.222751 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-config\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc 
kubenswrapper[5200]: I1126 13:49:21.222843 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q6m2\" (UniqueName: \"kubernetes.io/projected/8f127dad-fe57-4c5f-9029-b884f62ea84b-kube-api-access-8q6m2\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.222906 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-dns-svc\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.223992 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-dns-svc\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.224049 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-config\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.243764 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q6m2\" (UniqueName: \"kubernetes.io/projected/8f127dad-fe57-4c5f-9029-b884f62ea84b-kube-api-access-8q6m2\") pod \"dnsmasq-dns-6f8cccd557-6khkp\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.267856 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.392829 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.552110 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-x97v9"] Nov 26 13:49:21 crc kubenswrapper[5200]: I1126 13:49:21.691585 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-6khkp"] Nov 26 13:49:21 crc kubenswrapper[5200]: W1126 13:49:21.698115 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f127dad_fe57_4c5f_9029_b884f62ea84b.slice/crio-cd191995906b19ad00febf7714207edb487ed315521976e9e343bff03b5a9689 WatchSource:0}: Error finding container cd191995906b19ad00febf7714207edb487ed315521976e9e343bff03b5a9689: Status 404 returned error can't find the container with id cd191995906b19ad00febf7714207edb487ed315521976e9e343bff03b5a9689 Nov 26 13:49:22 crc kubenswrapper[5200]: I1126 13:49:22.431998 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" event={"ID":"8f127dad-fe57-4c5f-9029-b884f62ea84b","Type":"ContainerStarted","Data":"cd191995906b19ad00febf7714207edb487ed315521976e9e343bff03b5a9689"} Nov 26 13:49:22 crc kubenswrapper[5200]: I1126 13:49:22.433803 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" event={"ID":"912eca56-ca63-4646-8546-98708ceaee05","Type":"ContainerStarted","Data":"53754a07e960b3208447210cc9ca129b5050fe17ad4691a295db0a3509bd188a"} Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.117237 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-x97v9"] Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.137256 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-7s2td"] Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.205763 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-7s2td"] Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.205870 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.375388 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-dns-svc\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.375500 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-config\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.375565 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqdn\" (UniqueName: \"kubernetes.io/projected/9297d5cc-4467-41e9-961c-0b86e92deff8-kube-api-access-2qqdn\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.477058 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-config\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.477161 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqdn\" (UniqueName: \"kubernetes.io/projected/9297d5cc-4467-41e9-961c-0b86e92deff8-kube-api-access-2qqdn\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.477233 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-dns-svc\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.478892 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-dns-svc\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.479086 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-config\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.499804 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqdn\" (UniqueName: \"kubernetes.io/projected/9297d5cc-4467-41e9-961c-0b86e92deff8-kube-api-access-2qqdn\") pod \"dnsmasq-dns-588bd8c8c5-7s2td\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " 
pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.566753 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.939666 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-6khkp"] Nov 26 13:49:23 crc kubenswrapper[5200]: I1126 13:49:23.980288 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-pmsfs"] Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.015468 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-pmsfs"] Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.015653 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.191409 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-dns-svc\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.191502 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-config\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.192283 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbf29\" (UniqueName: \"kubernetes.io/projected/94740f8f-ad62-4ced-a015-703d5429f0a2-kube-api-access-tbf29\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.227565 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-7s2td"] Nov 26 13:49:24 crc kubenswrapper[5200]: W1126 13:49:24.232797 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9297d5cc_4467_41e9_961c_0b86e92deff8.slice/crio-42da4b5ecf0bb0b99c3a329434aa3e06a6b51586dc76dc80cf8fa5abef155074 WatchSource:0}: Error finding container 42da4b5ecf0bb0b99c3a329434aa3e06a6b51586dc76dc80cf8fa5abef155074: Status 404 returned error can't find the container with id 42da4b5ecf0bb0b99c3a329434aa3e06a6b51586dc76dc80cf8fa5abef155074 Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.290845 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.299026 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-dns-svc\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.299125 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-config\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.299197 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbf29\" (UniqueName: \"kubernetes.io/projected/94740f8f-ad62-4ced-a015-703d5429f0a2-kube-api-access-tbf29\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.301396 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-dns-svc\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.302043 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-config\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.307394 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.307587 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.333380 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-default-user\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.333754 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-config-data\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.333891 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-plugins-conf\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.334277 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-erlang-cookie\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.334624 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-rabbitmq-cell1-svc\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.334833 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-conf\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.338094 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-dockercfg-vkfbh\"" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.361008 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbf29\" (UniqueName: \"kubernetes.io/projected/94740f8f-ad62-4ced-a015-703d5429f0a2-kube-api-access-tbf29\") pod \"dnsmasq-dns-6686bbb8b9-pmsfs\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.463994 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" event={"ID":"9297d5cc-4467-41e9-961c-0b86e92deff8","Type":"ContainerStarted","Data":"42da4b5ecf0bb0b99c3a329434aa3e06a6b51586dc76dc80cf8fa5abef155074"} Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.507658 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.507797 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.507857 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c5390ed-d89e-4450-adf3-1363b1150d1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.507898 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.507955 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plp4v\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-kube-api-access-plp4v\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.507984 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.508014 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.508073 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.508098 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/4c5390ed-d89e-4450-adf3-1363b1150d1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.508600 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.508639 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.610532 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.611092 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c5390ed-d89e-4450-adf3-1363b1150d1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.612292 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.611537 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.612759 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.612915 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plp4v\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-kube-api-access-plp4v\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.614806 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.614843 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.615711 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.616366 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.616432 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.616462 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c5390ed-d89e-4450-adf3-1363b1150d1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.616530 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.616583 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.617387 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.618718 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c5390ed-d89e-4450-adf3-1363b1150d1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: 
I1126 13:49:24.618832 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.623776 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.625133 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c5390ed-d89e-4450-adf3-1363b1150d1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.631591 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.632271 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.640463 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plp4v\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-kube-api-access-plp4v\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.649396 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.676206 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:24 crc kubenswrapper[5200]: I1126 13:49:24.691766 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.022356 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.110671 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.118318 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.125208 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-pmsfs"] Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.125889 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-default-user\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.126149 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-erlang-cookie\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.126300 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-server-conf\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.126533 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-server-dockercfg-gc5kk\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.127294 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-plugins-conf\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.128068 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-config-data\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.128558 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-rabbitmq-svc\"" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.163208 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230410 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230471 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf4bs\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-kube-api-access-wf4bs\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230507 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230537 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230573 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230589 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f5ea8d4-ba2b-4abe-b645-0513268421f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230615 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f5ea8d4-ba2b-4abe-b645-0513268421f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230640 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230808 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230850 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.230902 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333163 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333277 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333361 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333420 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333455 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf4bs\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-kube-api-access-wf4bs\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333528 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333602 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333719 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333749 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f5ea8d4-ba2b-4abe-b645-0513268421f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333815 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f5ea8d4-ba2b-4abe-b645-0513268421f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.333888 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.335271 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.336041 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " 
pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.336471 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.336664 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.337608 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.341817 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.353140 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f5ea8d4-ba2b-4abe-b645-0513268421f9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.353271 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f5ea8d4-ba2b-4abe-b645-0513268421f9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.355769 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.362499 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.366738 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.368321 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf4bs\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-kube-api-access-wf4bs\") pod \"rabbitmq-server-0\" (UID: 
\"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.445997 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.493060 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c5390ed-d89e-4450-adf3-1363b1150d1b","Type":"ContainerStarted","Data":"f3f4a3c3e89a0ab7552cee34135566562569879c24cc8008070380114ae9665e"} Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.504483 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" event={"ID":"94740f8f-ad62-4ced-a015-703d5429f0a2","Type":"ContainerStarted","Data":"559e1d16dfc3c0238b3d6cc3bb68e97028015c2dd74e3721572d72fb876958a5"} Nov 26 13:49:25 crc kubenswrapper[5200]: I1126 13:49:25.960735 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.336526 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.349866 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.354754 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"galera-openstack-dockercfg-jckgj\"" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.355082 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.375384 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-config-data\"" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.376190 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-scripts\"" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.379227 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-galera-openstack-svc\"" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.379883 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"combined-ca-bundle\"" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.454771 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.454842 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-kolla-config\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.454868 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xxz\" (UniqueName: \"kubernetes.io/projected/27a62b3a-c69a-47fa-9642-425931d81827-kube-api-access-n2xxz\") pod \"openstack-galera-0\" (UID: 
\"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.454944 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.455013 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.455078 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27a62b3a-c69a-47fa-9642-425931d81827-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.455099 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.455199 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-config-data-default\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.524372 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5f5ea8d4-ba2b-4abe-b645-0513268421f9","Type":"ContainerStarted","Data":"963f4c0429a8a71b49f8ba43583a756c78417e5f61ebaacc9a04799d278f16fb"} Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.558304 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27a62b3a-c69a-47fa-9642-425931d81827-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.558364 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.558403 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-config-data-default\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.558666 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.558704 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-kolla-config\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.558725 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xxz\" (UniqueName: \"kubernetes.io/projected/27a62b3a-c69a-47fa-9642-425931d81827-kube-api-access-n2xxz\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.559204 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27a62b3a-c69a-47fa-9642-425931d81827-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.560490 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-kolla-config\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.560525 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-config-data-default\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.560554 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.560889 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.561083 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.562899 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.581513 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.585840 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.590613 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xxz\" (UniqueName: \"kubernetes.io/projected/27a62b3a-c69a-47fa-9642-425931d81827-kube-api-access-n2xxz\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.644317 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " pod="openstack/openstack-galera-0" Nov 26 13:49:26 crc kubenswrapper[5200]: I1126 13:49:26.698666 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.697311 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.751837 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.752040 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.757942 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1-config-data\"" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.758272 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1-scripts\"" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.758955 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"galera-openstack-cell1-dockercfg-svnbz\"" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.759255 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-galera-openstack-cell1-svc\"" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.927874 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.927959 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.928005 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.928043 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.928086 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.928109 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.928171 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-default\") pod \"openstack-cell1-galera-0\" 
(UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.928233 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfb24\" (UniqueName: \"kubernetes.io/projected/937b077a-80f5-4f02-a636-68791a2cfe0f-kube-api-access-vfb24\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.943922 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.965249 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.967751 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.973478 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-memcached-svc\"" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.973853 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"memcached-config-data\"" Nov 26 13:49:27 crc kubenswrapper[5200]: I1126 13:49:27.979927 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"memcached-memcached-dockercfg-z4mf5\"" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031098 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031155 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031211 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031245 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031266 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031285 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031328 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.031799 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.032844 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.033367 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.033459 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.033814 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfb24\" (UniqueName: \"kubernetes.io/projected/937b077a-80f5-4f02-a636-68791a2cfe0f-kube-api-access-vfb24\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.034440 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.048580 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.053474 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.073556 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfb24\" (UniqueName: \"kubernetes.io/projected/937b077a-80f5-4f02-a636-68791a2cfe0f-kube-api-access-vfb24\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.101476 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.137213 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-config-data\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.137636 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.137744 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.137803 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-kolla-config\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.137844 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvhqs\" (UniqueName: \"kubernetes.io/projected/55e57b83-77f4-446e-bc0a-c78f9094aacb-kube-api-access-jvhqs\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.239469 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.239560 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-kolla-config\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 
crc kubenswrapper[5200]: I1126 13:49:28.239605 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvhqs\" (UniqueName: \"kubernetes.io/projected/55e57b83-77f4-446e-bc0a-c78f9094aacb-kube-api-access-jvhqs\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.239651 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-config-data\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.239671 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.240805 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-config-data\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.241503 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-kolla-config\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.252697 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.252914 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.266099 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvhqs\" (UniqueName: \"kubernetes.io/projected/55e57b83-77f4-446e-bc0a-c78f9094aacb-kube-api-access-jvhqs\") pod \"memcached-0\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.306782 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 13:49:28 crc kubenswrapper[5200]: I1126 13:49:28.395195 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.072693 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.085513 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.088711 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"telemetry-ceilometer-dockercfg-mwj5x\"" Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.089633 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.291208 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jh9f\" (UniqueName: \"kubernetes.io/projected/20a99fd8-1db7-48a0-ad37-5652b13c7e00-kube-api-access-2jh9f\") pod \"kube-state-metrics-0\" (UID: \"20a99fd8-1db7-48a0-ad37-5652b13c7e00\") " pod="openstack/kube-state-metrics-0" Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.392823 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jh9f\" (UniqueName: \"kubernetes.io/projected/20a99fd8-1db7-48a0-ad37-5652b13c7e00-kube-api-access-2jh9f\") pod \"kube-state-metrics-0\" (UID: \"20a99fd8-1db7-48a0-ad37-5652b13c7e00\") " pod="openstack/kube-state-metrics-0" Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.426953 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jh9f\" (UniqueName: \"kubernetes.io/projected/20a99fd8-1db7-48a0-ad37-5652b13c7e00-kube-api-access-2jh9f\") pod \"kube-state-metrics-0\" (UID: \"20a99fd8-1db7-48a0-ad37-5652b13c7e00\") " pod="openstack/kube-state-metrics-0" Nov 26 13:49:30 crc kubenswrapper[5200]: I1126 13:49:30.708793 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.576118 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kp5s8"] Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.618133 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qk25n"] Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.623871 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.627573 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovncontroller-ovndbs\"" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.627874 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncontroller-ovncontroller-dockercfg-5ct9s\"" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.629884 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-scripts\"" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.631189 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kp5s8"] Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.631219 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qk25n"] Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.631338 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633158 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-ovn-controller-tls-certs\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633250 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run-ovn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-log-ovn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633308 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633482 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-combined-ca-bundle\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633777 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btmcn\" (UniqueName: \"kubernetes.io/projected/743192ea-7611-42dc-ab35-0dd584590479-kube-api-access-btmcn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.633869 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/743192ea-7611-42dc-ab35-0dd584590479-scripts\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736015 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-lib\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736063 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-log\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " 
pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736092 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/257070be-da5f-4601-b466-b9f7c8c8f153-scripts\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736300 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-ovn-controller-tls-certs\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736456 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run-ovn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736491 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-log-ovn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736537 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6l7\" (UniqueName: \"kubernetes.io/projected/257070be-da5f-4601-b466-b9f7c8c8f153-kube-api-access-7n6l7\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736590 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736696 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-combined-ca-bundle\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736775 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-etc-ovs\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736920 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btmcn\" (UniqueName: \"kubernetes.io/projected/743192ea-7611-42dc-ab35-0dd584590479-kube-api-access-btmcn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.736948 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-run\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.737041 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/743192ea-7611-42dc-ab35-0dd584590479-scripts\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.737189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run-ovn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.737375 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.740257 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-log-ovn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.744841 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-ovn-controller-tls-certs\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.746662 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-combined-ca-bundle\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.754828 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/743192ea-7611-42dc-ab35-0dd584590479-scripts\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.764372 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btmcn\" (UniqueName: \"kubernetes.io/projected/743192ea-7611-42dc-ab35-0dd584590479-kube-api-access-btmcn\") pod \"ovn-controller-kp5s8\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.839282 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6l7\" (UniqueName: \"kubernetes.io/projected/257070be-da5f-4601-b466-b9f7c8c8f153-kube-api-access-7n6l7\") pod \"ovn-controller-ovs-qk25n\" (UID: 
\"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.839656 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-etc-ovs\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.839719 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-run\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.840615 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-etc-ovs\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.840698 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-run\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.845878 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-lib\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.846033 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-log\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.846116 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/257070be-da5f-4601-b466-b9f7c8c8f153-scripts\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.848774 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/257070be-da5f-4601-b466-b9f7c8c8f153-scripts\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.848991 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-lib\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.849081 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-log\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.877224 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6l7\" (UniqueName: \"kubernetes.io/projected/257070be-da5f-4601-b466-b9f7c8c8f153-kube-api-access-7n6l7\") pod \"ovn-controller-ovs-qk25n\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.959313 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:32 crc kubenswrapper[5200]: I1126 13:49:32.972622 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.699262 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.782956 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.783151 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.787020 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovn-metrics\"" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.787433 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-nb-config\"" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.787751 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-nb-scripts\"" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.787943 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncluster-ovndbcluster-nb-dockercfg-gcpt9\"" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.787643 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovndbcluster-nb-ovndbs\"" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.893322 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.893731 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.893758 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc 
kubenswrapper[5200]: I1126 13:49:34.893790 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.893811 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5xvj\" (UniqueName: \"kubernetes.io/projected/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-kube-api-access-f5xvj\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.894000 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.894403 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.895062 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-config\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.997616 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.998752 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-config\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.998906 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.999346 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.999385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.999425 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:34 crc kubenswrapper[5200]: I1126 13:49:34.999446 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5xvj\" (UniqueName: \"kubernetes.io/projected/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-kube-api-access-f5xvj\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:34.998351 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.001941 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.006598 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.007453 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-config\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.007846 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.013102 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.013368 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.015202 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.023335 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5xvj\" (UniqueName: \"kubernetes.io/projected/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-kube-api-access-f5xvj\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.029772 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:35 crc kubenswrapper[5200]: I1126 13:49:35.111398 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:37 crc kubenswrapper[5200]: I1126 13:49:37.994923 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.163129 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.163302 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.166548 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovndbcluster-sb-ovndbs\"" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.167035 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncluster-ovndbcluster-sb-dockercfg-9z9vp\"" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.168083 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-sb-config\"" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.168107 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-sb-scripts\"" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.279067 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.279145 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.279201 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc 
kubenswrapper[5200]: I1126 13:49:38.279517 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.279845 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/cb616b41-7836-4e58-a88b-bb68df018d24-kube-api-access-th8hr\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.280067 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.280107 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-config\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.280238 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.381802 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/cb616b41-7836-4e58-a88b-bb68df018d24-kube-api-access-th8hr\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.381911 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.381947 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-config\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.381994 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.382482 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.382526 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.382617 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.383667 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.384479 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.383908 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-config\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.384725 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.385132 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.392997 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.393060 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.403398 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.412431 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/cb616b41-7836-4e58-a88b-bb68df018d24-kube-api-access-th8hr\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.436256 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:38 crc kubenswrapper[5200]: I1126 13:49:38.505193 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:41 crc kubenswrapper[5200]: I1126 13:49:41.986431 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:49:42 crc kubenswrapper[5200]: E1126 13:49:42.424406 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1303145 actualBytes=10240 Nov 26 13:49:42 crc kubenswrapper[5200]: W1126 13:49:42.655284 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a99fd8_1db7_48a0_ad37_5652b13c7e00.slice/crio-f77ad4e3b078a74ae55351a8b70f4fa4f5e882e2ceff88b3b28eac508632a06e WatchSource:0}: Error finding container f77ad4e3b078a74ae55351a8b70f4fa4f5e882e2ceff88b3b28eac508632a06e: Status 404 returned error can't find the container with id f77ad4e3b078a74ae55351a8b70f4fa4f5e882e2ceff88b3b28eac508632a06e Nov 26 13:49:42 crc kubenswrapper[5200]: I1126 13:49:42.710044 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"20a99fd8-1db7-48a0-ad37-5652b13c7e00","Type":"ContainerStarted","Data":"f77ad4e3b078a74ae55351a8b70f4fa4f5e882e2ceff88b3b28eac508632a06e"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.141128 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 13:49:43 crc kubenswrapper[5200]: W1126 13:49:43.163175 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27a62b3a_c69a_47fa_9642_425931d81827.slice/crio-49e6428f30ffa795a18306705bd3fe6ce12473027bed514dcdf8505f5e67e50e WatchSource:0}: Error finding container 49e6428f30ffa795a18306705bd3fe6ce12473027bed514dcdf8505f5e67e50e: Status 404 returned error can't find the container with id 49e6428f30ffa795a18306705bd3fe6ce12473027bed514dcdf8505f5e67e50e Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.252455 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.263784 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kp5s8"] Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.272502 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 26 13:49:43 crc kubenswrapper[5200]: W1126 13:49:43.281280 5200 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod743192ea_7611_42dc_ab35_0dd584590479.slice/crio-e0809c7e4b7929c0761155dec6ef452bd7bcc964baea60a35d2ea6007dd719c9 WatchSource:0}: Error finding container e0809c7e4b7929c0761155dec6ef452bd7bcc964baea60a35d2ea6007dd719c9: Status 404 returned error can't find the container with id e0809c7e4b7929c0761155dec6ef452bd7bcc964baea60a35d2ea6007dd719c9 Nov 26 13:49:43 crc kubenswrapper[5200]: W1126 13:49:43.281790 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e57b83_77f4_446e_bc0a_c78f9094aacb.slice/crio-bc3f8dee613dcc94578135d14352848e76bf8c58cc711537db2bb71c55a99bc0 WatchSource:0}: Error finding container bc3f8dee613dcc94578135d14352848e76bf8c58cc711537db2bb71c55a99bc0: Status 404 returned error can't find the container with id bc3f8dee613dcc94578135d14352848e76bf8c58cc711537db2bb71c55a99bc0 Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.601377 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qk25n"] Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.716569 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 13:49:43 crc kubenswrapper[5200]: W1126 13:49:43.721485 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb616b41_7836_4e58_a88b_bb68df018d24.slice/crio-ecbbbbf41072a038578703834cff83b26f4c2b1fa187c491036e99ce716e9272 WatchSource:0}: Error finding container ecbbbbf41072a038578703834cff83b26f4c2b1fa187c491036e99ce716e9272: Status 404 returned error can't find the container with id ecbbbbf41072a038578703834cff83b26f4c2b1fa187c491036e99ce716e9272 Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.721993 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"937b077a-80f5-4f02-a636-68791a2cfe0f","Type":"ContainerStarted","Data":"cdc8cdf94f4ea78ae1d9c535b352b21a0dec18159d1e7a4725c1f8f6049efcb7"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.725408 5200 generic.go:358] "Generic (PLEG): container finished" podID="8f127dad-fe57-4c5f-9029-b884f62ea84b" containerID="33df5bd4b2cec136d6fa4380cb87986e9cbf0b16e58c4ada575e6b71e4462a91" exitCode=0 Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.725546 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" event={"ID":"8f127dad-fe57-4c5f-9029-b884f62ea84b","Type":"ContainerDied","Data":"33df5bd4b2cec136d6fa4380cb87986e9cbf0b16e58c4ada575e6b71e4462a91"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.732233 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerStarted","Data":"c03d27d3370349ea15d3c4f499299d5e80b62d8c1de43874581cb87ebcce774e"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.736746 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27a62b3a-c69a-47fa-9642-425931d81827","Type":"ContainerStarted","Data":"49e6428f30ffa795a18306705bd3fe6ce12473027bed514dcdf8505f5e67e50e"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.738400 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" 
event={"ID":"94740f8f-ad62-4ced-a015-703d5429f0a2","Type":"ContainerStarted","Data":"4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.740164 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" event={"ID":"912eca56-ca63-4646-8546-98708ceaee05","Type":"ContainerStarted","Data":"064864007fbe8b0913fee5d6b15af42d74f49411a218e0d2f184f9dee98627b3"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.742201 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"55e57b83-77f4-446e-bc0a-c78f9094aacb","Type":"ContainerStarted","Data":"bc3f8dee613dcc94578135d14352848e76bf8c58cc711537db2bb71c55a99bc0"} Nov 26 13:49:43 crc kubenswrapper[5200]: I1126 13:49:43.746748 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8" event={"ID":"743192ea-7611-42dc-ab35-0dd584590479","Type":"ContainerStarted","Data":"e0809c7e4b7929c0761155dec6ef452bd7bcc964baea60a35d2ea6007dd719c9"} Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.088748 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.216551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q6m2\" (UniqueName: \"kubernetes.io/projected/8f127dad-fe57-4c5f-9029-b884f62ea84b-kube-api-access-8q6m2\") pod \"8f127dad-fe57-4c5f-9029-b884f62ea84b\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.218103 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-config\") pod \"8f127dad-fe57-4c5f-9029-b884f62ea84b\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.218414 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-dns-svc\") pod \"8f127dad-fe57-4c5f-9029-b884f62ea84b\" (UID: \"8f127dad-fe57-4c5f-9029-b884f62ea84b\") " Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.310600 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f127dad-fe57-4c5f-9029-b884f62ea84b" (UID: "8f127dad-fe57-4c5f-9029-b884f62ea84b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.324783 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.352882 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8vf92"] Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.354379 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f127dad-fe57-4c5f-9029-b884f62ea84b" containerName="init" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.354405 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f127dad-fe57-4c5f-9029-b884f62ea84b" containerName="init" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.354643 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f127dad-fe57-4c5f-9029-b884f62ea84b" containerName="init" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.379959 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f127dad-fe57-4c5f-9029-b884f62ea84b-kube-api-access-8q6m2" (OuterVolumeSpecName: "kube-api-access-8q6m2") pod "8f127dad-fe57-4c5f-9029-b884f62ea84b" (UID: "8f127dad-fe57-4c5f-9029-b884f62ea84b"). InnerVolumeSpecName "kube-api-access-8q6m2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.414822 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-config" (OuterVolumeSpecName: "config") pod "8f127dad-fe57-4c5f-9029-b884f62ea84b" (UID: "8f127dad-fe57-4c5f-9029-b884f62ea84b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.427125 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f127dad-fe57-4c5f-9029-b884f62ea84b-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.427184 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8q6m2\" (UniqueName: \"kubernetes.io/projected/8f127dad-fe57-4c5f-9029-b884f62ea84b-kube-api-access-8q6m2\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.762138 5200 generic.go:358] "Generic (PLEG): container finished" podID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerID="4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7" exitCode=0 Nov 26 13:49:44 crc kubenswrapper[5200]: I1126 13:49:44.765236 5200 generic.go:358] "Generic (PLEG): container finished" podID="912eca56-ca63-4646-8546-98708ceaee05" containerID="064864007fbe8b0913fee5d6b15af42d74f49411a218e0d2f184f9dee98627b3" exitCode=0 Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.064530 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.064575 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.068715 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-metrics-config\"" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.080771 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8vf92"] Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.080830 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-7s2td"] Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.080857 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" event={"ID":"8f127dad-fe57-4c5f-9029-b884f62ea84b","Type":"ContainerDied","Data":"cd191995906b19ad00febf7714207edb487ed315521976e9e343bff03b5a9689"} Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.080901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" event={"ID":"9297d5cc-4467-41e9-961c-0b86e92deff8","Type":"ContainerStarted","Data":"acbe50c8399d11de4efad6d2497ddd488d54b33ac6ccc0c0d4fcef7abe234c29"} Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.080936 5200 scope.go:117] "RemoveContainer" containerID="33df5bd4b2cec136d6fa4380cb87986e9cbf0b16e58c4ada575e6b71e4462a91" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.080919 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5fdfb5df-2jj2g"] Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.139902 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b72f68-40bd-4a81-914a-dda4355b9fc4-config\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.140344 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-combined-ca-bundle\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.140459 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.140717 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf5tk\" (UniqueName: \"kubernetes.io/projected/82b72f68-40bd-4a81-914a-dda4355b9fc4-kube-api-access-hf5tk\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.143642 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovn-rundir\") pod \"ovn-controller-metrics-8vf92\" (UID: 
\"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.143937 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovs-rundir\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.245783 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovn-rundir\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.245881 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovs-rundir\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.245945 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b72f68-40bd-4a81-914a-dda4355b9fc4-config\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.246007 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-combined-ca-bundle\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.246053 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.246141 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf5tk\" (UniqueName: \"kubernetes.io/projected/82b72f68-40bd-4a81-914a-dda4355b9fc4-kube-api-access-hf5tk\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.246350 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovs-rundir\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.247075 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovn-rundir\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " 
pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.247481 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b72f68-40bd-4a81-914a-dda4355b9fc4-config\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.254092 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.262950 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-combined-ca-bundle\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.267262 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf5tk\" (UniqueName: \"kubernetes.io/projected/82b72f68-40bd-4a81-914a-dda4355b9fc4-kube-api-access-hf5tk\") pod \"ovn-controller-metrics-8vf92\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.458095 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:49:45 crc kubenswrapper[5200]: I1126 13:49:45.791638 5200 generic.go:358] "Generic (PLEG): container finished" podID="9297d5cc-4467-41e9-961c-0b86e92deff8" containerID="acbe50c8399d11de4efad6d2497ddd488d54b33ac6ccc0c0d4fcef7abe234c29" exitCode=0 Nov 26 13:49:46 crc kubenswrapper[5200]: W1126 13:49:46.020618 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82b72f68_40bd_4a81_914a_dda4355b9fc4.slice/crio-ec2b94313c69a6266c3798bf759b2f31e170dc51201062d579193afea0c477b8 WatchSource:0}: Error finding container ec2b94313c69a6266c3798bf759b2f31e170dc51201062d579193afea0c477b8: Status 404 returned error can't find the container with id ec2b94313c69a6266c3798bf759b2f31e170dc51201062d579193afea0c477b8 Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.280807 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5fdfb5df-2jj2g"] Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.280891 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" event={"ID":"94740f8f-ad62-4ced-a015-703d5429f0a2","Type":"ContainerDied","Data":"4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.280937 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.281257 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8vf92"] Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.281310 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" 
event={"ID":"912eca56-ca63-4646-8546-98708ceaee05","Type":"ContainerDied","Data":"064864007fbe8b0913fee5d6b15af42d74f49411a218e0d2f184f9dee98627b3"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.281351 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb616b41-7836-4e58-a88b-bb68df018d24","Type":"ContainerStarted","Data":"ecbbbbf41072a038578703834cff83b26f4c2b1fa187c491036e99ce716e9272"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.281375 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50","Type":"ContainerStarted","Data":"0e841449e4eb0414395ea3facea2006a78302a8213608e2cdd3e15ba212e9e9a"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.281399 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" event={"ID":"9297d5cc-4467-41e9-961c-0b86e92deff8","Type":"ContainerDied","Data":"acbe50c8399d11de4efad6d2497ddd488d54b33ac6ccc0c0d4fcef7abe234c29"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.281486 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.289639 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovsdbserver-sb\"" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.379418 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-dns-svc\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.379671 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-config\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.380826 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.381002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9tr\" (UniqueName: \"kubernetes.io/projected/798f83e1-037c-4d1f-a3e8-9b579183e07a-kube-api-access-9n9tr\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.482388 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-dns-svc\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.482477 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-config\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.482613 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.483620 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-dns-svc\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.483921 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.483921 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-config\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.484277 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9tr\" (UniqueName: \"kubernetes.io/projected/798f83e1-037c-4d1f-a3e8-9b579183e07a-kube-api-access-9n9tr\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.507303 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9tr\" (UniqueName: \"kubernetes.io/projected/798f83e1-037c-4d1f-a3e8-9b579183e07a-kube-api-access-9n9tr\") pod \"dnsmasq-dns-5c5fdfb5df-2jj2g\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.641355 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.688234 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/912eca56-ca63-4646-8546-98708ceaee05-config\") pod \"912eca56-ca63-4646-8546-98708ceaee05\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.688436 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8s5q\" (UniqueName: \"kubernetes.io/projected/912eca56-ca63-4646-8546-98708ceaee05-kube-api-access-f8s5q\") pod \"912eca56-ca63-4646-8546-98708ceaee05\" (UID: \"912eca56-ca63-4646-8546-98708ceaee05\") " Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.705077 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/912eca56-ca63-4646-8546-98708ceaee05-kube-api-access-f8s5q" (OuterVolumeSpecName: "kube-api-access-f8s5q") pod "912eca56-ca63-4646-8546-98708ceaee05" (UID: "912eca56-ca63-4646-8546-98708ceaee05"). InnerVolumeSpecName "kube-api-access-f8s5q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.724266 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/912eca56-ca63-4646-8546-98708ceaee05-config" (OuterVolumeSpecName: "config") pod "912eca56-ca63-4646-8546-98708ceaee05" (UID: "912eca56-ca63-4646-8546-98708ceaee05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.725152 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.791312 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/912eca56-ca63-4646-8546-98708ceaee05-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.791353 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8s5q\" (UniqueName: \"kubernetes.io/projected/912eca56-ca63-4646-8546-98708ceaee05-kube-api-access-f8s5q\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.805930 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" event={"ID":"912eca56-ca63-4646-8546-98708ceaee05","Type":"ContainerDied","Data":"53754a07e960b3208447210cc9ca129b5050fe17ad4691a295db0a3509bd188a"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.806014 5200 scope.go:117] "RemoveContainer" containerID="064864007fbe8b0913fee5d6b15af42d74f49411a218e0d2f184f9dee98627b3" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.806213 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-x97v9" Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.818222 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vf92" event={"ID":"82b72f68-40bd-4a81-914a-dda4355b9fc4","Type":"ContainerStarted","Data":"ec2b94313c69a6266c3798bf759b2f31e170dc51201062d579193afea0c477b8"} Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.924552 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-x97v9"] Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.942388 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-x97v9"] Nov 26 13:49:46 crc kubenswrapper[5200]: I1126 13:49:46.985177 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="912eca56-ca63-4646-8546-98708ceaee05" path="/var/lib/kubelet/pods/912eca56-ca63-4646-8546-98708ceaee05/volumes" Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.265546 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5fdfb5df-2jj2g"] Nov 26 13:49:47 crc kubenswrapper[5200]: W1126 13:49:47.277670 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod798f83e1_037c_4d1f_a3e8_9b579183e07a.slice/crio-268f9f894d3976bc2adefc062c276b34328e371e2737c5217d046b7679dfa405 WatchSource:0}: Error finding container 268f9f894d3976bc2adefc062c276b34328e371e2737c5217d046b7679dfa405: Status 404 returned error can't find the container with id 268f9f894d3976bc2adefc062c276b34328e371e2737c5217d046b7679dfa405 Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.832709 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" event={"ID":"94740f8f-ad62-4ced-a015-703d5429f0a2","Type":"ContainerStarted","Data":"0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced"} Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.833211 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.843574 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" event={"ID":"798f83e1-037c-4d1f-a3e8-9b579183e07a","Type":"ContainerStarted","Data":"268f9f894d3976bc2adefc062c276b34328e371e2737c5217d046b7679dfa405"} Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.846290 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5f5ea8d4-ba2b-4abe-b645-0513268421f9","Type":"ContainerStarted","Data":"76726c7914d31f34afba939283a29384999603044a931c1bcc9aba2f60a43f99"} Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.849704 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c5390ed-d89e-4450-adf3-1363b1150d1b","Type":"ContainerStarted","Data":"fc8ecc37de36174c47bf0b036cad135c5bf8af671bf2e44d522e81e85324b227"} Nov 26 13:49:47 crc kubenswrapper[5200]: I1126 13:49:47.859998 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" podStartSLOduration=6.830831948 podStartE2EDuration="24.859971643s" podCreationTimestamp="2025-11-26 13:49:23 +0000 UTC" firstStartedPulling="2025-11-26 13:49:25.157175165 +0000 UTC m=+1133.836376983" lastFinishedPulling="2025-11-26 13:49:43.18631486 +0000 UTC 
m=+1151.865516678" observedRunningTime="2025-11-26 13:49:47.85227715 +0000 UTC m=+1156.531478988" watchObservedRunningTime="2025-11-26 13:49:47.859971643 +0000 UTC m=+1156.539173461" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.414932 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.537742 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-dns-svc\") pod \"9297d5cc-4467-41e9-961c-0b86e92deff8\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.537894 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qqdn\" (UniqueName: \"kubernetes.io/projected/9297d5cc-4467-41e9-961c-0b86e92deff8-kube-api-access-2qqdn\") pod \"9297d5cc-4467-41e9-961c-0b86e92deff8\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.537934 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-config\") pod \"9297d5cc-4467-41e9-961c-0b86e92deff8\" (UID: \"9297d5cc-4467-41e9-961c-0b86e92deff8\") " Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.550727 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9297d5cc-4467-41e9-961c-0b86e92deff8-kube-api-access-2qqdn" (OuterVolumeSpecName: "kube-api-access-2qqdn") pod "9297d5cc-4467-41e9-961c-0b86e92deff8" (UID: "9297d5cc-4467-41e9-961c-0b86e92deff8"). InnerVolumeSpecName "kube-api-access-2qqdn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.566629 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-config" (OuterVolumeSpecName: "config") pod "9297d5cc-4467-41e9-961c-0b86e92deff8" (UID: "9297d5cc-4467-41e9-961c-0b86e92deff8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.569420 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9297d5cc-4467-41e9-961c-0b86e92deff8" (UID: "9297d5cc-4467-41e9-961c-0b86e92deff8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.639966 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.640017 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qqdn\" (UniqueName: \"kubernetes.io/projected/9297d5cc-4467-41e9-961c-0b86e92deff8-kube-api-access-2qqdn\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.640029 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9297d5cc-4467-41e9-961c-0b86e92deff8-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.863821 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.864019 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-7s2td" event={"ID":"9297d5cc-4467-41e9-961c-0b86e92deff8","Type":"ContainerDied","Data":"42da4b5ecf0bb0b99c3a329434aa3e06a6b51586dc76dc80cf8fa5abef155074"} Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.864100 5200 scope.go:117] "RemoveContainer" containerID="acbe50c8399d11de4efad6d2497ddd488d54b33ac6ccc0c0d4fcef7abe234c29" Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.932189 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-7s2td"] Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.942457 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-7s2td"] Nov 26 13:49:48 crc kubenswrapper[5200]: I1126 13:49:48.983082 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9297d5cc-4467-41e9-961c-0b86e92deff8" path="/var/lib/kubelet/pods/9297d5cc-4467-41e9-961c-0b86e92deff8/volumes" Nov 26 13:49:53 crc kubenswrapper[5200]: I1126 13:49:53.867678 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.961569 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vf92" event={"ID":"82b72f68-40bd-4a81-914a-dda4355b9fc4","Type":"ContainerStarted","Data":"bf61bc7bfe41e1213dbebfb173eaf919afb90cb614da617b3eb37e70e0d8b683"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.964714 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"937b077a-80f5-4f02-a636-68791a2cfe0f","Type":"ContainerStarted","Data":"dcdcd6e1ce754876345a56277736b1613ed38c8f5c5d0d53a8279dd09e133bb0"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.967431 5200 generic.go:358] "Generic (PLEG): container finished" podID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerID="b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d" exitCode=0 Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.967658 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" event={"ID":"798f83e1-037c-4d1f-a3e8-9b579183e07a","Type":"ContainerDied","Data":"b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.983455 5200 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerStarted","Data":"f33740e7d3adadc613cd9ce218aeb6d42b611cbf434cfa017559867f048dc3ab"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.983821 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27a62b3a-c69a-47fa-9642-425931d81827","Type":"ContainerStarted","Data":"4314dc8e7e3ff2e877888334a17d2850ed752614b00e829594c4b4e7858e6db5"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.983912 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"20a99fd8-1db7-48a0-ad37-5652b13c7e00","Type":"ContainerStarted","Data":"2ce94c31dcd5188fa720de14e32b86ec905a5c470d231749cdd9917175af70e9"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.984000 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"55e57b83-77f4-446e-bc0a-c78f9094aacb","Type":"ContainerStarted","Data":"4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.984078 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/kube-state-metrics-0" Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.984155 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/memcached-0" Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.984229 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb616b41-7836-4e58-a88b-bb68df018d24","Type":"ContainerStarted","Data":"f5e60a029fe6708251acff52c633b481ec59ead862a791d6d3ca76ff77b39620"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.986083 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50","Type":"ContainerStarted","Data":"ad8e5c8ff258934df8b031e4ca400c00da84dc8e84950692fb247eb689629ffd"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.986151 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50","Type":"ContainerStarted","Data":"3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.989154 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8" event={"ID":"743192ea-7611-42dc-ab35-0dd584590479","Type":"ContainerStarted","Data":"7bb16d3a8f9b0cf1fc4e509932320bf940df760e4ccedec11d61d40970672efa"} Nov 26 13:49:56 crc kubenswrapper[5200]: I1126 13:49:56.989359 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-kp5s8" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.015963 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8vf92" podStartSLOduration=2.720426158 podStartE2EDuration="13.015935246s" podCreationTimestamp="2025-11-26 13:49:44 +0000 UTC" firstStartedPulling="2025-11-26 13:49:46.02628869 +0000 UTC m=+1154.705490508" lastFinishedPulling="2025-11-26 13:49:56.321797778 +0000 UTC m=+1165.000999596" observedRunningTime="2025-11-26 13:49:56.994445726 +0000 UTC m=+1165.673647554" watchObservedRunningTime="2025-11-26 13:49:57.015935246 +0000 UTC m=+1165.695137054" Nov 26 13:49:57 crc 
kubenswrapper[5200]: I1126 13:49:57.020208 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.012708351 podStartE2EDuration="30.020193169s" podCreationTimestamp="2025-11-26 13:49:27 +0000 UTC" firstStartedPulling="2025-11-26 13:49:43.285320075 +0000 UTC m=+1151.964521893" lastFinishedPulling="2025-11-26 13:49:55.292804883 +0000 UTC m=+1163.972006711" observedRunningTime="2025-11-26 13:49:57.015262178 +0000 UTC m=+1165.694463996" watchObservedRunningTime="2025-11-26 13:49:57.020193169 +0000 UTC m=+1165.699394987" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.071180 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kp5s8" podStartSLOduration=12.109517045 podStartE2EDuration="25.071161302s" podCreationTimestamp="2025-11-26 13:49:32 +0000 UTC" firstStartedPulling="2025-11-26 13:49:43.283409254 +0000 UTC m=+1151.962611072" lastFinishedPulling="2025-11-26 13:49:56.245053501 +0000 UTC m=+1164.924255329" observedRunningTime="2025-11-26 13:49:57.061238838 +0000 UTC m=+1165.740440656" watchObservedRunningTime="2025-11-26 13:49:57.071161302 +0000 UTC m=+1165.750363120" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.094785 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.654704185 podStartE2EDuration="21.094765308s" podCreationTimestamp="2025-11-26 13:49:36 +0000 UTC" firstStartedPulling="2025-11-26 13:49:43.725466238 +0000 UTC m=+1152.404668056" lastFinishedPulling="2025-11-26 13:49:56.165527351 +0000 UTC m=+1164.844729179" observedRunningTime="2025-11-26 13:49:57.080603622 +0000 UTC m=+1165.759805440" watchObservedRunningTime="2025-11-26 13:49:57.094765308 +0000 UTC m=+1165.773967116" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.154099 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.682192018 podStartE2EDuration="24.154080362s" podCreationTimestamp="2025-11-26 13:49:33 +0000 UTC" firstStartedPulling="2025-11-26 13:49:44.692293091 +0000 UTC m=+1153.371494909" lastFinishedPulling="2025-11-26 13:49:56.164181405 +0000 UTC m=+1164.843383253" observedRunningTime="2025-11-26 13:49:57.150480216 +0000 UTC m=+1165.829682034" watchObservedRunningTime="2025-11-26 13:49:57.154080362 +0000 UTC m=+1165.833282180" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.271381 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.630737931 podStartE2EDuration="27.271356064s" podCreationTimestamp="2025-11-26 13:49:30 +0000 UTC" firstStartedPulling="2025-11-26 13:49:42.6602765 +0000 UTC m=+1151.339478358" lastFinishedPulling="2025-11-26 13:49:56.300894673 +0000 UTC m=+1164.980096491" observedRunningTime="2025-11-26 13:49:57.182303301 +0000 UTC m=+1165.861505119" watchObservedRunningTime="2025-11-26 13:49:57.271356064 +0000 UTC m=+1165.950557882" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.401324 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5fdfb5df-2jj2g"] Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.473622 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d8564ff-2dlq2"] Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.485094 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="912eca56-ca63-4646-8546-98708ceaee05" containerName="init" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.485136 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="912eca56-ca63-4646-8546-98708ceaee05" containerName="init" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.485177 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9297d5cc-4467-41e9-961c-0b86e92deff8" containerName="init" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.485183 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9297d5cc-4467-41e9-961c-0b86e92deff8" containerName="init" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.485388 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9297d5cc-4467-41e9-961c-0b86e92deff8" containerName="init" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.485411 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="912eca56-ca63-4646-8546-98708ceaee05" containerName="init" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.498180 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.501326 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovsdbserver-nb\"" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.510080 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d8564ff-2dlq2"] Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.551092 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-nb\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.551168 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-config\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.551277 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-sb\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.551303 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9x7\" (UniqueName: \"kubernetes.io/projected/9a0b98f9-f27d-429a-a8d0-058f59c8272c-kube-api-access-lv9x7\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.551328 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-dns-svc\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 
13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.653194 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-sb\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.653762 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9x7\" (UniqueName: \"kubernetes.io/projected/9a0b98f9-f27d-429a-a8d0-058f59c8272c-kube-api-access-lv9x7\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.653807 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-dns-svc\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.653841 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-nb\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.653913 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-config\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.654303 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-sb\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.654901 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-nb\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.655079 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-config\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.655389 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-dns-svc\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.676168 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9x7\" (UniqueName: 
\"kubernetes.io/projected/9a0b98f9-f27d-429a-a8d0-058f59c8272c-kube-api-access-lv9x7\") pod \"dnsmasq-dns-7d8564ff-2dlq2\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:57 crc kubenswrapper[5200]: I1126 13:49:57.842612 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.008899 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" event={"ID":"798f83e1-037c-4d1f-a3e8-9b579183e07a","Type":"ContainerStarted","Data":"39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d"} Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.009100 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerName="dnsmasq-dns" containerID="cri-o://39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d" gracePeriod=10 Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.009599 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.014117 5200 generic.go:358] "Generic (PLEG): container finished" podID="257070be-da5f-4601-b466-b9f7c8c8f153" containerID="f33740e7d3adadc613cd9ce218aeb6d42b611cbf434cfa017559867f048dc3ab" exitCode=0 Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.014881 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerDied","Data":"f33740e7d3adadc613cd9ce218aeb6d42b611cbf434cfa017559867f048dc3ab"} Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.019425 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb616b41-7836-4e58-a88b-bb68df018d24","Type":"ContainerStarted","Data":"76f45a7a6ec5de6fe10745e16931f137f558e7423c3c48a60e8dae410a4c952b"} Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.049756 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" podStartSLOduration=14.049728097 podStartE2EDuration="14.049728097s" podCreationTimestamp="2025-11-26 13:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:49:58.031137493 +0000 UTC m=+1166.710339331" watchObservedRunningTime="2025-11-26 13:49:58.049728097 +0000 UTC m=+1166.728929915" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.351893 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d8564ff-2dlq2"] Nov 26 13:49:58 crc kubenswrapper[5200]: W1126 13:49:58.358708 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a0b98f9_f27d_429a_a8d0_058f59c8272c.slice/crio-1466484d34e3f8a3be3fd371fa2de2639bfe606eba513c2fe84087cbb1f06f7f WatchSource:0}: Error finding container 1466484d34e3f8a3be3fd371fa2de2639bfe606eba513c2fe84087cbb1f06f7f: Status 404 returned error can't find the container with id 1466484d34e3f8a3be3fd371fa2de2639bfe606eba513c2fe84087cbb1f06f7f Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.492833 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.507886 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.578783 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-dns-svc\") pod \"798f83e1-037c-4d1f-a3e8-9b579183e07a\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.579123 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n9tr\" (UniqueName: \"kubernetes.io/projected/798f83e1-037c-4d1f-a3e8-9b579183e07a-kube-api-access-9n9tr\") pod \"798f83e1-037c-4d1f-a3e8-9b579183e07a\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.579194 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-config\") pod \"798f83e1-037c-4d1f-a3e8-9b579183e07a\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.579660 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-ovsdbserver-sb\") pod \"798f83e1-037c-4d1f-a3e8-9b579183e07a\" (UID: \"798f83e1-037c-4d1f-a3e8-9b579183e07a\") " Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.597256 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798f83e1-037c-4d1f-a3e8-9b579183e07a-kube-api-access-9n9tr" (OuterVolumeSpecName: "kube-api-access-9n9tr") pod "798f83e1-037c-4d1f-a3e8-9b579183e07a" (UID: "798f83e1-037c-4d1f-a3e8-9b579183e07a"). InnerVolumeSpecName "kube-api-access-9n9tr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.649613 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-config" (OuterVolumeSpecName: "config") pod "798f83e1-037c-4d1f-a3e8-9b579183e07a" (UID: "798f83e1-037c-4d1f-a3e8-9b579183e07a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.654770 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "798f83e1-037c-4d1f-a3e8-9b579183e07a" (UID: "798f83e1-037c-4d1f-a3e8-9b579183e07a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.678305 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "798f83e1-037c-4d1f-a3e8-9b579183e07a" (UID: "798f83e1-037c-4d1f-a3e8-9b579183e07a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.682020 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9n9tr\" (UniqueName: \"kubernetes.io/projected/798f83e1-037c-4d1f-a3e8-9b579183e07a-kube-api-access-9n9tr\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.682062 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.682074 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:58 crc kubenswrapper[5200]: I1126 13:49:58.682083 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/798f83e1-037c-4d1f-a3e8-9b579183e07a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.032092 5200 generic.go:358] "Generic (PLEG): container finished" podID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerID="39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d" exitCode=0 Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.032631 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" event={"ID":"798f83e1-037c-4d1f-a3e8-9b579183e07a","Type":"ContainerDied","Data":"39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d"} Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.032668 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" event={"ID":"798f83e1-037c-4d1f-a3e8-9b579183e07a","Type":"ContainerDied","Data":"268f9f894d3976bc2adefc062c276b34328e371e2737c5217d046b7679dfa405"} Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.032710 5200 scope.go:117] "RemoveContainer" containerID="39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.032872 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5fdfb5df-2jj2g" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.037032 5200 generic.go:358] "Generic (PLEG): container finished" podID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerID="ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e" exitCode=0 Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.037396 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" event={"ID":"9a0b98f9-f27d-429a-a8d0-058f59c8272c","Type":"ContainerDied","Data":"ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e"} Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.037460 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" event={"ID":"9a0b98f9-f27d-429a-a8d0-058f59c8272c","Type":"ContainerStarted","Data":"1466484d34e3f8a3be3fd371fa2de2639bfe606eba513c2fe84087cbb1f06f7f"} Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.051551 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerStarted","Data":"7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a"} Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.051618 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerStarted","Data":"f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b"} Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.077985 5200 scope.go:117] "RemoveContainer" containerID="b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.102326 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qk25n" podStartSLOduration=14.855527577 podStartE2EDuration="27.102301126s" podCreationTimestamp="2025-11-26 13:49:32 +0000 UTC" firstStartedPulling="2025-11-26 13:49:43.689847288 +0000 UTC m=+1152.369049106" lastFinishedPulling="2025-11-26 13:49:55.936620817 +0000 UTC m=+1164.615822655" observedRunningTime="2025-11-26 13:49:59.092039854 +0000 UTC m=+1167.771241712" watchObservedRunningTime="2025-11-26 13:49:59.102301126 +0000 UTC m=+1167.781502944" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.112182 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.123661 5200 scope.go:117] "RemoveContainer" containerID="39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d" Nov 26 13:49:59 crc kubenswrapper[5200]: E1126 13:49:59.125949 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d\": container with ID starting with 39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d not found: ID does not exist" containerID="39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.126014 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d"} err="failed to get container status \"39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d\": rpc error: code = NotFound 
desc = could not find container \"39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d\": container with ID starting with 39411292c8987b415b2eb40bfd40c7008a3b7e07d513115e0f2cd77289e9100d not found: ID does not exist" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.126058 5200 scope.go:117] "RemoveContainer" containerID="b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d" Nov 26 13:49:59 crc kubenswrapper[5200]: E1126 13:49:59.126391 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d\": container with ID starting with b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d not found: ID does not exist" containerID="b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.126491 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d"} err="failed to get container status \"b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d\": rpc error: code = NotFound desc = could not find container \"b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d\": container with ID starting with b4241f58da071ed910e6c4714b6758179d9f7e8f8cd8fd594d144fb3b8133d9d not found: ID does not exist" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.156576 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5fdfb5df-2jj2g"] Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.165770 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5fdfb5df-2jj2g"] Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.483466 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.485901 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-nb-0" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.505678 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 26 13:49:59 crc kubenswrapper[5200]: I1126 13:49:59.576004 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 26 13:50:00 crc kubenswrapper[5200]: I1126 13:50:00.066982 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" event={"ID":"9a0b98f9-f27d-429a-a8d0-058f59c8272c","Type":"ContainerStarted","Data":"47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458"} Nov 26 13:50:00 crc kubenswrapper[5200]: I1126 13:50:00.067549 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:50:00 crc kubenswrapper[5200]: I1126 13:50:00.067655 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:50:00 crc kubenswrapper[5200]: I1126 13:50:00.067706 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:50:00 crc kubenswrapper[5200]: I1126 13:50:00.102619 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" podStartSLOduration=3.102586289 
podStartE2EDuration="3.102586289s" podCreationTimestamp="2025-11-26 13:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:00.091615688 +0000 UTC m=+1168.770817506" watchObservedRunningTime="2025-11-26 13:50:00.102586289 +0000 UTC m=+1168.781788157" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:00.980869 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" path="/var/lib/kubelet/pods/798f83e1-037c-4d1f-a3e8-9b579183e07a/volumes" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.121043 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.149336 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.405797 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.407568 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerName="init" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.407595 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerName="init" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.407673 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerName="dnsmasq-dns" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.407697 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerName="dnsmasq-dns" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.407916 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="798f83e1-037c-4d1f-a3e8-9b579183e07a" containerName="dnsmasq-dns" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.430739 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.430862 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.433308 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovnnorthd-ovnnorthd-dockercfg-s5fd4\"" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.434908 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovnnorthd-scripts\"" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.436996 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovnnorthd-ovndbs\"" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.437275 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovnnorthd-config\"" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453051 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-config\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453525 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-scripts\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453590 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453609 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453645 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453678 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/12479106-7c82-418c-b017-03e5fcc4f018-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.453753 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccmg\" (UniqueName: \"kubernetes.io/projected/12479106-7c82-418c-b017-03e5fcc4f018-kube-api-access-5ccmg\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.555753 5200 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.555843 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/12479106-7c82-418c-b017-03e5fcc4f018-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.555942 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccmg\" (UniqueName: \"kubernetes.io/projected/12479106-7c82-418c-b017-03e5fcc4f018-kube-api-access-5ccmg\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.556028 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-config\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.556094 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-scripts\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.556767 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/12479106-7c82-418c-b017-03e5fcc4f018-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.557204 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-config\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.556558 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.557281 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.557399 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-scripts\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.564817 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.565643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.566046 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.580654 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccmg\" (UniqueName: \"kubernetes.io/projected/12479106-7c82-418c-b017-03e5fcc4f018-kube-api-access-5ccmg\") pod \"ovn-northd-0\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " pod="openstack/ovn-northd-0" Nov 26 13:50:01 crc kubenswrapper[5200]: I1126 13:50:01.748837 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 13:50:02 crc kubenswrapper[5200]: I1126 13:50:02.256359 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 13:50:03 crc kubenswrapper[5200]: I1126 13:50:03.022993 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 26 13:50:03 crc kubenswrapper[5200]: I1126 13:50:03.097669 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"12479106-7c82-418c-b017-03e5fcc4f018","Type":"ContainerStarted","Data":"6fad5f9ac71a608927bf373e740c42fd1061b54d5ef59a0fb256f2b53ad1b2d8"} Nov 26 13:50:03 crc kubenswrapper[5200]: I1126 13:50:03.154877 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:50:03 crc kubenswrapper[5200]: I1126 13:50:03.155025 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:50:04 crc kubenswrapper[5200]: I1126 13:50:04.114227 5200 generic.go:358] "Generic (PLEG): container finished" podID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerID="dcdcd6e1ce754876345a56277736b1613ed38c8f5c5d0d53a8279dd09e133bb0" exitCode=0 Nov 26 13:50:04 crc kubenswrapper[5200]: I1126 13:50:04.114344 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"937b077a-80f5-4f02-a636-68791a2cfe0f","Type":"ContainerDied","Data":"dcdcd6e1ce754876345a56277736b1613ed38c8f5c5d0d53a8279dd09e133bb0"} Nov 26 13:50:05 crc kubenswrapper[5200]: I1126 13:50:05.124518 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"12479106-7c82-418c-b017-03e5fcc4f018","Type":"ContainerStarted","Data":"5a4f406803509ed4c9f81a25d1d85f51801ee9e253c61023a592d2368060e342"} Nov 26 13:50:05 crc kubenswrapper[5200]: I1126 13:50:05.125302 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"12479106-7c82-418c-b017-03e5fcc4f018","Type":"ContainerStarted","Data":"70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141"} Nov 26 13:50:05 crc kubenswrapper[5200]: I1126 13:50:05.125322 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-northd-0" Nov 26 13:50:05 crc kubenswrapper[5200]: I1126 13:50:05.127449 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"937b077a-80f5-4f02-a636-68791a2cfe0f","Type":"ContainerStarted","Data":"8eb763e1a24e243fecca1c957e0644829929333388f6e4d32ab9411ef66f8cf1"} Nov 26 13:50:05 crc kubenswrapper[5200]: I1126 13:50:05.149524 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8490167629999998 podStartE2EDuration="4.149504126s" podCreationTimestamp="2025-11-26 13:50:01 +0000 UTC" firstStartedPulling="2025-11-26 13:50:02.276119782 +0000 UTC m=+1170.955321600" lastFinishedPulling="2025-11-26 13:50:04.576607145 +0000 UTC m=+1173.255808963" observedRunningTime="2025-11-26 13:50:05.143325292 +0000 UTC m=+1173.822527100" watchObservedRunningTime="2025-11-26 13:50:05.149504126 +0000 UTC m=+1173.828705944" Nov 26 13:50:05 crc kubenswrapper[5200]: I1126 13:50:05.183941 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.210911482 podStartE2EDuration="39.183903649s" podCreationTimestamp="2025-11-26 13:49:26 +0000 UTC" firstStartedPulling="2025-11-26 13:49:43.272152417 +0000 UTC m=+1151.951354235" lastFinishedPulling="2025-11-26 13:49:56.245144574 +0000 UTC m=+1164.924346402" observedRunningTime="2025-11-26 13:50:05.166888977 +0000 UTC m=+1173.846090805" watchObservedRunningTime="2025-11-26 13:50:05.183903649 +0000 UTC m=+1173.863105477" Nov 26 13:50:06 crc kubenswrapper[5200]: I1126 13:50:06.150508 5200 generic.go:358] "Generic (PLEG): container finished" podID="27a62b3a-c69a-47fa-9642-425931d81827" containerID="4314dc8e7e3ff2e877888334a17d2850ed752614b00e829594c4b4e7858e6db5" exitCode=0 Nov 26 13:50:06 crc kubenswrapper[5200]: I1126 13:50:06.150582 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27a62b3a-c69a-47fa-9642-425931d81827","Type":"ContainerDied","Data":"4314dc8e7e3ff2e877888334a17d2850ed752614b00e829594c4b4e7858e6db5"} Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.087984 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.171959 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27a62b3a-c69a-47fa-9642-425931d81827","Type":"ContainerStarted","Data":"9d12f78cd05b747844b762871d784048deb6a46596cfe397348c7103574c5d30"} Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.241215 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-pmsfs"] Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.241588 5200 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerName="dnsmasq-dns" containerID="cri-o://0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced" gracePeriod=10 Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.256773 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.505994643 podStartE2EDuration="42.256743681s" podCreationTimestamp="2025-11-26 13:49:25 +0000 UTC" firstStartedPulling="2025-11-26 13:49:43.185736185 +0000 UTC m=+1151.864938003" lastFinishedPulling="2025-11-26 13:49:55.936485223 +0000 UTC m=+1164.615687041" observedRunningTime="2025-11-26 13:50:07.220816068 +0000 UTC m=+1175.900017906" watchObservedRunningTime="2025-11-26 13:50:07.256743681 +0000 UTC m=+1175.935945499" Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.751556 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.907712 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-config\") pod \"94740f8f-ad62-4ced-a015-703d5429f0a2\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.907823 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbf29\" (UniqueName: \"kubernetes.io/projected/94740f8f-ad62-4ced-a015-703d5429f0a2-kube-api-access-tbf29\") pod \"94740f8f-ad62-4ced-a015-703d5429f0a2\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.907925 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-dns-svc\") pod \"94740f8f-ad62-4ced-a015-703d5429f0a2\" (UID: \"94740f8f-ad62-4ced-a015-703d5429f0a2\") " Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.917077 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94740f8f-ad62-4ced-a015-703d5429f0a2-kube-api-access-tbf29" (OuterVolumeSpecName: "kube-api-access-tbf29") pod "94740f8f-ad62-4ced-a015-703d5429f0a2" (UID: "94740f8f-ad62-4ced-a015-703d5429f0a2"). InnerVolumeSpecName "kube-api-access-tbf29". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.970519 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-config" (OuterVolumeSpecName: "config") pod "94740f8f-ad62-4ced-a015-703d5429f0a2" (UID: "94740f8f-ad62-4ced-a015-703d5429f0a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:07 crc kubenswrapper[5200]: I1126 13:50:07.976151 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94740f8f-ad62-4ced-a015-703d5429f0a2" (UID: "94740f8f-ad62-4ced-a015-703d5429f0a2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.009662 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.010557 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbf29\" (UniqueName: \"kubernetes.io/projected/94740f8f-ad62-4ced-a015-703d5429f0a2-kube-api-access-tbf29\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.010574 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94740f8f-ad62-4ced-a015-703d5429f0a2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.024988 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.186191 5200 generic.go:358] "Generic (PLEG): container finished" podID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerID="0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced" exitCode=0 Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.186383 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" event={"ID":"94740f8f-ad62-4ced-a015-703d5429f0a2","Type":"ContainerDied","Data":"0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced"} Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.187296 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" event={"ID":"94740f8f-ad62-4ced-a015-703d5429f0a2","Type":"ContainerDied","Data":"559e1d16dfc3c0238b3d6cc3bb68e97028015c2dd74e3721572d72fb876958a5"} Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.187343 5200 scope.go:117] "RemoveContainer" containerID="0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.186539 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-pmsfs" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.220221 5200 scope.go:117] "RemoveContainer" containerID="4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.224550 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-pmsfs"] Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.231579 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-pmsfs"] Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.242918 5200 scope.go:117] "RemoveContainer" containerID="0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced" Nov 26 13:50:08 crc kubenswrapper[5200]: E1126 13:50:08.243626 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced\": container with ID starting with 0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced not found: ID does not exist" containerID="0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.243714 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced"} err="failed to get container status \"0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced\": rpc error: code = NotFound desc = could not find container \"0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced\": container with ID starting with 0ead6c78eadfc156ec129cc28ae08e70db676d38bdd9a8ed911e62a5022a5ced not found: ID does not exist" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.243756 5200 scope.go:117] "RemoveContainer" containerID="4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7" Nov 26 13:50:08 crc kubenswrapper[5200]: E1126 13:50:08.244224 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7\": container with ID starting with 4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7 not found: ID does not exist" containerID="4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.244264 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7"} err="failed to get container status \"4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7\": rpc error: code = NotFound desc = could not find container \"4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7\": container with ID starting with 4ad0e58431793d805b6ef973b311e305e8630016998d534941fb59003eb6c8c7 not found: ID does not exist" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.395616 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.396093 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/openstack-cell1-galera-0" Nov 26 13:50:08 crc kubenswrapper[5200]: I1126 13:50:08.982020 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" path="/var/lib/kubelet/pods/94740f8f-ad62-4ced-a015-703d5429f0a2/volumes" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.613316 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb6f9b497-2587r"] Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.615172 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerName="init" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.615579 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerName="init" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.615623 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerName="dnsmasq-dns" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.615630 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerName="dnsmasq-dns" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.615900 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="94740f8f-ad62-4ced-a015-703d5429f0a2" containerName="dnsmasq-dns" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.741643 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb6f9b497-2587r"] Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.741878 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.817045 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.871248 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-dns-svc\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.871311 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-config\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.871375 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwf6g\" (UniqueName: \"kubernetes.io/projected/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-kube-api-access-gwf6g\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.871426 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-sb\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.871468 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-nb\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.924826 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.973841 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-dns-svc\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.973969 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-config\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.974027 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwf6g\" (UniqueName: \"kubernetes.io/projected/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-kube-api-access-gwf6g\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.974786 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-sb\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.975807 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-config\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.975836 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-nb\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.976440 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-nb\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc kubenswrapper[5200]: I1126 13:50:10.975205 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-dns-svc\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:10 crc 
kubenswrapper[5200]: I1126 13:50:10.975619 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-sb\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.000016 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwf6g\" (UniqueName: \"kubernetes.io/projected/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-kube-api-access-gwf6g\") pod \"dnsmasq-dns-5fb6f9b497-2587r\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.061160 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.638532 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb6f9b497-2587r"] Nov 26 13:50:11 crc kubenswrapper[5200]: W1126 13:50:11.662391 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59a26d9f_edf9_4a31_b95a_d4a0894a7ba2.slice/crio-58d50599bdced46d86ad26d090f80ed97666580fa5ae2106b87be80f0121b7c5 WatchSource:0}: Error finding container 58d50599bdced46d86ad26d090f80ed97666580fa5ae2106b87be80f0121b7c5: Status 404 returned error can't find the container with id 58d50599bdced46d86ad26d090f80ed97666580fa5ae2106b87be80f0121b7c5 Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.668975 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.687942 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.692665 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-storage-config-data\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.692932 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-conf\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.693474 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-swift-dockercfg-ggxcv\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.693644 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-ring-files\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.708118 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.804047 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.804126 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8rq\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-kube-api-access-pq8rq\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.804164 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.804295 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-cache\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.804635 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-lock\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.837976 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wn5kl"] Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.848440 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.851018 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-proxy-config-data\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.851418 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-ring-scripts\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.857917 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wn5kl"] Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.865175 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-ring-config-data\"" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.906802 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8rq\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-kube-api-access-pq8rq\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.906867 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.906923 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-cache\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.906981 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-lock\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.907000 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.907391 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.907659 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-cache\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.907977 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-lock\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") 
" pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: E1126 13:50:11.908520 5200 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 13:50:11 crc kubenswrapper[5200]: E1126 13:50:11.908549 5200 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 13:50:11 crc kubenswrapper[5200]: E1126 13:50:11.908627 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift podName:9e8c608b-5148-492a-a8a7-b8eccc586561 nodeName:}" failed. No retries permitted until 2025-11-26 13:50:12.408601815 +0000 UTC m=+1181.087803633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift") pod "swift-storage-0" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561") : configmap "swift-ring-files" not found Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.927547 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq8rq\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-kube-api-access-pq8rq\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:11 crc kubenswrapper[5200]: I1126 13:50:11.933050 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.009081 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6c4r\" (UniqueName: \"kubernetes.io/projected/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-kube-api-access-c6c4r\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.009395 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-ring-data-devices\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.009475 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-dispersionconf\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.009569 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-etc-swift\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.009640 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-combined-ca-bundle\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.009758 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-scripts\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.010124 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-swiftconf\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.111915 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-etc-swift\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.112392 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-combined-ca-bundle\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.112472 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-etc-swift\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.112521 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-scripts\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.112860 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-swiftconf\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.113215 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6c4r\" (UniqueName: \"kubernetes.io/projected/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-kube-api-access-c6c4r\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.113341 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-ring-data-devices\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.113372 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-dispersionconf\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.113394 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-scripts\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.114097 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-ring-data-devices\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.118239 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-dispersionconf\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.118909 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-combined-ca-bundle\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.119266 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-swiftconf\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.134227 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6c4r\" (UniqueName: \"kubernetes.io/projected/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-kube-api-access-c6c4r\") pod \"swift-ring-rebalance-wn5kl\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.181842 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.240833 5200 generic.go:358] "Generic (PLEG): container finished" podID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerID="c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b" exitCode=0 Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.240924 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" event={"ID":"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2","Type":"ContainerDied","Data":"c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b"} Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.241001 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" event={"ID":"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2","Type":"ContainerStarted","Data":"58d50599bdced46d86ad26d090f80ed97666580fa5ae2106b87be80f0121b7c5"} Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.419843 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:12 crc kubenswrapper[5200]: E1126 13:50:12.420408 5200 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 13:50:12 crc kubenswrapper[5200]: E1126 13:50:12.421006 5200 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 13:50:12 crc kubenswrapper[5200]: E1126 13:50:12.421104 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift podName:9e8c608b-5148-492a-a8a7-b8eccc586561 nodeName:}" failed. No retries permitted until 2025-11-26 13:50:13.421076163 +0000 UTC m=+1182.100277991 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift") pod "swift-storage-0" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561") : configmap "swift-ring-files" not found Nov 26 13:50:12 crc kubenswrapper[5200]: I1126 13:50:12.801841 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wn5kl"] Nov 26 13:50:12 crc kubenswrapper[5200]: W1126 13:50:12.803278 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4a8a0f1_3a2d_47ec_84a9_042b56f2d0d6.slice/crio-c23499f6fa04ab52a4512c676cafea5cb7c2ffc03d559e9236fa0461624404f3 WatchSource:0}: Error finding container c23499f6fa04ab52a4512c676cafea5cb7c2ffc03d559e9236fa0461624404f3: Status 404 returned error can't find the container with id c23499f6fa04ab52a4512c676cafea5cb7c2ffc03d559e9236fa0461624404f3 Nov 26 13:50:13 crc kubenswrapper[5200]: I1126 13:50:13.270406 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" event={"ID":"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2","Type":"ContainerStarted","Data":"590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78"} Nov 26 13:50:13 crc kubenswrapper[5200]: I1126 13:50:13.270562 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:13 crc kubenswrapper[5200]: I1126 13:50:13.274142 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wn5kl" event={"ID":"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6","Type":"ContainerStarted","Data":"c23499f6fa04ab52a4512c676cafea5cb7c2ffc03d559e9236fa0461624404f3"} Nov 26 13:50:13 crc kubenswrapper[5200]: I1126 13:50:13.300990 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" podStartSLOduration=3.300960371 podStartE2EDuration="3.300960371s" podCreationTimestamp="2025-11-26 13:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:13.295201828 +0000 UTC m=+1181.974403666" watchObservedRunningTime="2025-11-26 13:50:13.300960371 +0000 UTC m=+1181.980162199" Nov 26 13:50:13 crc kubenswrapper[5200]: I1126 13:50:13.444863 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:13 crc kubenswrapper[5200]: E1126 13:50:13.445118 5200 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 13:50:13 crc kubenswrapper[5200]: E1126 13:50:13.445152 5200 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 13:50:13 crc kubenswrapper[5200]: E1126 13:50:13.445277 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift podName:9e8c608b-5148-492a-a8a7-b8eccc586561 nodeName:}" failed. No retries permitted until 2025-11-26 13:50:15.445243259 +0000 UTC m=+1184.124445117 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift") pod "swift-storage-0" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561") : configmap "swift-ring-files" not found Nov 26 13:50:15 crc kubenswrapper[5200]: I1126 13:50:15.096528 5200 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8f127dad-fe57-4c5f-9029-b884f62ea84b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8f127dad-fe57-4c5f-9029-b884f62ea84b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8f127dad_fe57_4c5f_9029_b884f62ea84b.slice" Nov 26 13:50:15 crc kubenswrapper[5200]: E1126 13:50:15.097346 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod8f127dad-fe57-4c5f-9029-b884f62ea84b] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod8f127dad-fe57-4c5f-9029-b884f62ea84b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8f127dad_fe57_4c5f_9029_b884f62ea84b.slice" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" podUID="8f127dad-fe57-4c5f-9029-b884f62ea84b" Nov 26 13:50:15 crc kubenswrapper[5200]: I1126 13:50:15.295345 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-6khkp" Nov 26 13:50:15 crc kubenswrapper[5200]: I1126 13:50:15.421288 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-6khkp"] Nov 26 13:50:15 crc kubenswrapper[5200]: I1126 13:50:15.434249 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-6khkp"] Nov 26 13:50:15 crc kubenswrapper[5200]: I1126 13:50:15.494447 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:15 crc kubenswrapper[5200]: E1126 13:50:15.494822 5200 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 13:50:15 crc kubenswrapper[5200]: E1126 13:50:15.494891 5200 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 13:50:15 crc kubenswrapper[5200]: E1126 13:50:15.495011 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift podName:9e8c608b-5148-492a-a8a7-b8eccc586561 nodeName:}" failed. No retries permitted until 2025-11-26 13:50:19.494976207 +0000 UTC m=+1188.174178065 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift") pod "swift-storage-0" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561") : configmap "swift-ring-files" not found Nov 26 13:50:16 crc kubenswrapper[5200]: I1126 13:50:16.225616 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 26 13:50:16 crc kubenswrapper[5200]: I1126 13:50:16.699537 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/openstack-galera-0" Nov 26 13:50:16 crc kubenswrapper[5200]: I1126 13:50:16.700017 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 26 13:50:16 crc kubenswrapper[5200]: I1126 13:50:16.932463 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 26 13:50:16 crc kubenswrapper[5200]: I1126 13:50:16.982161 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f127dad-fe57-4c5f-9029-b884f62ea84b" path="/var/lib/kubelet/pods/8f127dad-fe57-4c5f-9029-b884f62ea84b/volumes" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.326770 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wn5kl" event={"ID":"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6","Type":"ContainerStarted","Data":"7a8141c0c332f8af8b48e4fd35e84bf7b0c089845033f6d2d80786d4daaee498"} Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.361523 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wn5kl" podStartSLOduration=2.330284999 podStartE2EDuration="6.361498945s" podCreationTimestamp="2025-11-26 13:50:11 +0000 UTC" firstStartedPulling="2025-11-26 13:50:12.807753914 +0000 UTC m=+1181.486955772" lastFinishedPulling="2025-11-26 13:50:16.83896789 +0000 UTC m=+1185.518169718" observedRunningTime="2025-11-26 13:50:17.356238445 +0000 UTC m=+1186.035440303" watchObservedRunningTime="2025-11-26 13:50:17.361498945 +0000 UTC m=+1186.040700753" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.453251 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.927491 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bf07-account-create-update-97k2v"] Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.943743 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bf07-account-create-update-97k2v"] Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.943971 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.947444 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-db-secret\"" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.952170 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e804f8-39b6-4544-8117-3bb9c337973b-operator-scripts\") pod \"keystone-bf07-account-create-update-97k2v\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.952339 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpz7p\" (UniqueName: \"kubernetes.io/projected/d9e804f8-39b6-4544-8117-3bb9c337973b-kube-api-access-xpz7p\") pod \"keystone-bf07-account-create-update-97k2v\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.977042 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bwjrj"] Nov 26 13:50:17 crc kubenswrapper[5200]: I1126 13:50:17.988581 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.009210 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bwjrj"] Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.057029 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e804f8-39b6-4544-8117-3bb9c337973b-operator-scripts\") pod \"keystone-bf07-account-create-update-97k2v\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.057186 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c642182e-867b-4579-9289-0347c94a1e0a-operator-scripts\") pod \"keystone-db-create-bwjrj\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.057305 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ffd\" (UniqueName: \"kubernetes.io/projected/c642182e-867b-4579-9289-0347c94a1e0a-kube-api-access-z5ffd\") pod \"keystone-db-create-bwjrj\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.057599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpz7p\" (UniqueName: \"kubernetes.io/projected/d9e804f8-39b6-4544-8117-3bb9c337973b-kube-api-access-xpz7p\") pod \"keystone-bf07-account-create-update-97k2v\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.058897 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d9e804f8-39b6-4544-8117-3bb9c337973b-operator-scripts\") pod \"keystone-bf07-account-create-update-97k2v\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.078855 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpz7p\" (UniqueName: \"kubernetes.io/projected/d9e804f8-39b6-4544-8117-3bb9c337973b-kube-api-access-xpz7p\") pod \"keystone-bf07-account-create-update-97k2v\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.160735 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c642182e-867b-4579-9289-0347c94a1e0a-operator-scripts\") pod \"keystone-db-create-bwjrj\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.160846 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ffd\" (UniqueName: \"kubernetes.io/projected/c642182e-867b-4579-9289-0347c94a1e0a-kube-api-access-z5ffd\") pod \"keystone-db-create-bwjrj\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.161611 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c642182e-867b-4579-9289-0347c94a1e0a-operator-scripts\") pod \"keystone-db-create-bwjrj\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.178323 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-h9gqm"] Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.185311 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ffd\" (UniqueName: \"kubernetes.io/projected/c642182e-867b-4579-9289-0347c94a1e0a-kube-api-access-z5ffd\") pod \"keystone-db-create-bwjrj\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.187521 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.193232 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h9gqm"] Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.251176 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-e6ea-account-create-update-4jxg5"] Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.257405 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.261179 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-db-secret\"" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.262911 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzz7g\" (UniqueName: \"kubernetes.io/projected/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-kube-api-access-tzz7g\") pod \"placement-e6ea-account-create-update-4jxg5\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.262992 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-operator-scripts\") pod \"placement-e6ea-account-create-update-4jxg5\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.263028 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfj2k\" (UniqueName: \"kubernetes.io/projected/aa6ca788-d1ff-472b-905b-98575eb7a814-kube-api-access-kfj2k\") pod \"placement-db-create-h9gqm\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.263122 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa6ca788-d1ff-472b-905b-98575eb7a814-operator-scripts\") pod \"placement-db-create-h9gqm\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.275549 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.280086 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e6ea-account-create-update-4jxg5"] Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.308295 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.364470 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzz7g\" (UniqueName: \"kubernetes.io/projected/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-kube-api-access-tzz7g\") pod \"placement-e6ea-account-create-update-4jxg5\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.364586 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-operator-scripts\") pod \"placement-e6ea-account-create-update-4jxg5\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.364615 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfj2k\" (UniqueName: \"kubernetes.io/projected/aa6ca788-d1ff-472b-905b-98575eb7a814-kube-api-access-kfj2k\") pod \"placement-db-create-h9gqm\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.364728 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa6ca788-d1ff-472b-905b-98575eb7a814-operator-scripts\") pod \"placement-db-create-h9gqm\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.366625 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa6ca788-d1ff-472b-905b-98575eb7a814-operator-scripts\") pod \"placement-db-create-h9gqm\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.367199 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-operator-scripts\") pod \"placement-e6ea-account-create-update-4jxg5\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.381692 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfj2k\" (UniqueName: \"kubernetes.io/projected/aa6ca788-d1ff-472b-905b-98575eb7a814-kube-api-access-kfj2k\") pod \"placement-db-create-h9gqm\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.399584 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzz7g\" (UniqueName: \"kubernetes.io/projected/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-kube-api-access-tzz7g\") pod \"placement-e6ea-account-create-update-4jxg5\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.586598 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.594493 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.803755 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bf07-account-create-update-97k2v"] Nov 26 13:50:18 crc kubenswrapper[5200]: W1126 13:50:18.817987 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9e804f8_39b6_4544_8117_3bb9c337973b.slice/crio-ae61fbf0f8b1e804592cd40ba6c6c2abf90cde9ef43d20b32847f097489426b2 WatchSource:0}: Error finding container ae61fbf0f8b1e804592cd40ba6c6c2abf90cde9ef43d20b32847f097489426b2: Status 404 returned error can't find the container with id ae61fbf0f8b1e804592cd40ba6c6c2abf90cde9ef43d20b32847f097489426b2 Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.903498 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bwjrj"] Nov 26 13:50:18 crc kubenswrapper[5200]: I1126 13:50:18.934037 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e6ea-account-create-update-4jxg5"] Nov 26 13:50:18 crc kubenswrapper[5200]: W1126 13:50:18.957462 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe8fae71_4fe0_40cf_bcbc_7ca5e950aef4.slice/crio-f0cd4f8aedd727898107e9b005be0f34f4a37041054c6a3c4e0204202ea1e6d9 WatchSource:0}: Error finding container f0cd4f8aedd727898107e9b005be0f34f4a37041054c6a3c4e0204202ea1e6d9: Status 404 returned error can't find the container with id f0cd4f8aedd727898107e9b005be0f34f4a37041054c6a3c4e0204202ea1e6d9 Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.199015 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h9gqm"] Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.286826 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.361916 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d8564ff-2dlq2"] Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.370908 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h9gqm" event={"ID":"aa6ca788-d1ff-472b-905b-98575eb7a814","Type":"ContainerStarted","Data":"0662f9a224d48d4dcd30515d955f81afb1a03976c80158dd96b67fee131e0737"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.371508 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerName="dnsmasq-dns" containerID="cri-o://47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458" gracePeriod=10 Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.380066 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bwjrj" event={"ID":"c642182e-867b-4579-9289-0347c94a1e0a","Type":"ContainerStarted","Data":"41da013d26125d68b1a8509a400bc2f1b9e594ecbaa3ee72827d094ece021809"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.380317 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bwjrj" 
event={"ID":"c642182e-867b-4579-9289-0347c94a1e0a","Type":"ContainerStarted","Data":"43a44a0ea16d52efb0ab01aa38879787a99ba8900128705f7eb68754b292f16b"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.389616 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e6ea-account-create-update-4jxg5" event={"ID":"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4","Type":"ContainerStarted","Data":"0158049cf93b394d6d7796f9c468a05c7b8d8d0570e52f2d446a83652427fa8c"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.389675 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e6ea-account-create-update-4jxg5" event={"ID":"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4","Type":"ContainerStarted","Data":"f0cd4f8aedd727898107e9b005be0f34f4a37041054c6a3c4e0204202ea1e6d9"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.405725 5200 generic.go:358] "Generic (PLEG): container finished" podID="d9e804f8-39b6-4544-8117-3bb9c337973b" containerID="7109e9da47f7080d9ecd3d5f3703e4f8c86eeb23cecde84a94b7c73aba425122" exitCode=0 Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.405841 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf07-account-create-update-97k2v" event={"ID":"d9e804f8-39b6-4544-8117-3bb9c337973b","Type":"ContainerDied","Data":"7109e9da47f7080d9ecd3d5f3703e4f8c86eeb23cecde84a94b7c73aba425122"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.405901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf07-account-create-update-97k2v" event={"ID":"d9e804f8-39b6-4544-8117-3bb9c337973b","Type":"ContainerStarted","Data":"ae61fbf0f8b1e804592cd40ba6c6c2abf90cde9ef43d20b32847f097489426b2"} Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.431067 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-e6ea-account-create-update-4jxg5" podStartSLOduration=1.431036869 podStartE2EDuration="1.431036869s" podCreationTimestamp="2025-11-26 13:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:19.419543094 +0000 UTC m=+1188.098744912" watchObservedRunningTime="2025-11-26 13:50:19.431036869 +0000 UTC m=+1188.110238687" Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.501496 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:19 crc kubenswrapper[5200]: E1126 13:50:19.501783 5200 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 13:50:19 crc kubenswrapper[5200]: E1126 13:50:19.501822 5200 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 13:50:19 crc kubenswrapper[5200]: E1126 13:50:19.501906 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift podName:9e8c608b-5148-492a-a8a7-b8eccc586561 nodeName:}" failed. No retries permitted until 2025-11-26 13:50:27.501874978 +0000 UTC m=+1196.181076966 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift") pod "swift-storage-0" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561") : configmap "swift-ring-files" not found Nov 26 13:50:19 crc kubenswrapper[5200]: I1126 13:50:19.959259 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.009859 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv9x7\" (UniqueName: \"kubernetes.io/projected/9a0b98f9-f27d-429a-a8d0-058f59c8272c-kube-api-access-lv9x7\") pod \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.010030 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-sb\") pod \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.010178 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-dns-svc\") pod \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.010335 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-config\") pod \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.010363 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-nb\") pod \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\" (UID: \"9a0b98f9-f27d-429a-a8d0-058f59c8272c\") " Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.025326 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0b98f9-f27d-429a-a8d0-058f59c8272c-kube-api-access-lv9x7" (OuterVolumeSpecName: "kube-api-access-lv9x7") pod "9a0b98f9-f27d-429a-a8d0-058f59c8272c" (UID: "9a0b98f9-f27d-429a-a8d0-058f59c8272c"). InnerVolumeSpecName "kube-api-access-lv9x7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.097101 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a0b98f9-f27d-429a-a8d0-058f59c8272c" (UID: "9a0b98f9-f27d-429a-a8d0-058f59c8272c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.097129 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-config" (OuterVolumeSpecName: "config") pod "9a0b98f9-f27d-429a-a8d0-058f59c8272c" (UID: "9a0b98f9-f27d-429a-a8d0-058f59c8272c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.098487 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a0b98f9-f27d-429a-a8d0-058f59c8272c" (UID: "9a0b98f9-f27d-429a-a8d0-058f59c8272c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.107256 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a0b98f9-f27d-429a-a8d0-058f59c8272c" (UID: "9a0b98f9-f27d-429a-a8d0-058f59c8272c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.113460 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lv9x7\" (UniqueName: \"kubernetes.io/projected/9a0b98f9-f27d-429a-a8d0-058f59c8272c-kube-api-access-lv9x7\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.113503 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.113513 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.113522 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.113531 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0b98f9-f27d-429a-a8d0-058f59c8272c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.419802 5200 generic.go:358] "Generic (PLEG): container finished" podID="aa6ca788-d1ff-472b-905b-98575eb7a814" containerID="67c84e57c9be8a0f520564f094ee2c05a7164becc48de237903183acadf66bf9" exitCode=0 Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.420721 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h9gqm" event={"ID":"aa6ca788-d1ff-472b-905b-98575eb7a814","Type":"ContainerDied","Data":"67c84e57c9be8a0f520564f094ee2c05a7164becc48de237903183acadf66bf9"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.426438 5200 generic.go:358] "Generic (PLEG): container finished" podID="c642182e-867b-4579-9289-0347c94a1e0a" containerID="41da013d26125d68b1a8509a400bc2f1b9e594ecbaa3ee72827d094ece021809" exitCode=0 Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.426561 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bwjrj" event={"ID":"c642182e-867b-4579-9289-0347c94a1e0a","Type":"ContainerDied","Data":"41da013d26125d68b1a8509a400bc2f1b9e594ecbaa3ee72827d094ece021809"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.436550 5200 generic.go:358] "Generic (PLEG): container finished" podID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" 
containerID="76726c7914d31f34afba939283a29384999603044a931c1bcc9aba2f60a43f99" exitCode=0 Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.436665 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5f5ea8d4-ba2b-4abe-b645-0513268421f9","Type":"ContainerDied","Data":"76726c7914d31f34afba939283a29384999603044a931c1bcc9aba2f60a43f99"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.444482 5200 generic.go:358] "Generic (PLEG): container finished" podID="fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" containerID="0158049cf93b394d6d7796f9c468a05c7b8d8d0570e52f2d446a83652427fa8c" exitCode=0 Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.444779 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e6ea-account-create-update-4jxg5" event={"ID":"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4","Type":"ContainerDied","Data":"0158049cf93b394d6d7796f9c468a05c7b8d8d0570e52f2d446a83652427fa8c"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.467311 5200 generic.go:358] "Generic (PLEG): container finished" podID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerID="fc8ecc37de36174c47bf0b036cad135c5bf8af671bf2e44d522e81e85324b227" exitCode=0 Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.467415 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c5390ed-d89e-4450-adf3-1363b1150d1b","Type":"ContainerDied","Data":"fc8ecc37de36174c47bf0b036cad135c5bf8af671bf2e44d522e81e85324b227"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.477239 5200 generic.go:358] "Generic (PLEG): container finished" podID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerID="47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458" exitCode=0 Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.477628 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.477955 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" event={"ID":"9a0b98f9-f27d-429a-a8d0-058f59c8272c","Type":"ContainerDied","Data":"47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.478053 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8564ff-2dlq2" event={"ID":"9a0b98f9-f27d-429a-a8d0-058f59c8272c","Type":"ContainerDied","Data":"1466484d34e3f8a3be3fd371fa2de2639bfe606eba513c2fe84087cbb1f06f7f"} Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.478093 5200 scope.go:117] "RemoveContainer" containerID="47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.601271 5200 scope.go:117] "RemoveContainer" containerID="ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.620222 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d8564ff-2dlq2"] Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.626487 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d8564ff-2dlq2"] Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.663033 5200 scope.go:117] "RemoveContainer" containerID="47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458" Nov 26 13:50:20 crc kubenswrapper[5200]: E1126 13:50:20.663712 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458\": container with ID starting with 47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458 not found: ID does not exist" containerID="47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.663765 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458"} err="failed to get container status \"47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458\": rpc error: code = NotFound desc = could not find container \"47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458\": container with ID starting with 47273ee9ecef33a0ca5eb2b651b52322fa23f4c8352b8f96ac34b179d3558458 not found: ID does not exist" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.663798 5200 scope.go:117] "RemoveContainer" containerID="ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e" Nov 26 13:50:20 crc kubenswrapper[5200]: E1126 13:50:20.664252 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e\": container with ID starting with ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e not found: ID does not exist" containerID="ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.664312 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e"} err="failed to get container status 
\"ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e\": rpc error: code = NotFound desc = could not find container \"ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e\": container with ID starting with ea753bb3fb17774221ca201221a705e864965ca7715434722a615003b187378e not found: ID does not exist" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.982170 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:20 crc kubenswrapper[5200]: I1126 13:50:20.985235 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" path="/var/lib/kubelet/pods/9a0b98f9-f27d-429a-a8d0-058f59c8272c/volumes" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.035057 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.046492 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ffd\" (UniqueName: \"kubernetes.io/projected/c642182e-867b-4579-9289-0347c94a1e0a-kube-api-access-z5ffd\") pod \"c642182e-867b-4579-9289-0347c94a1e0a\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.046889 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c642182e-867b-4579-9289-0347c94a1e0a-operator-scripts\") pod \"c642182e-867b-4579-9289-0347c94a1e0a\" (UID: \"c642182e-867b-4579-9289-0347c94a1e0a\") " Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.047774 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c642182e-867b-4579-9289-0347c94a1e0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c642182e-867b-4579-9289-0347c94a1e0a" (UID: "c642182e-867b-4579-9289-0347c94a1e0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.054703 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c642182e-867b-4579-9289-0347c94a1e0a-kube-api-access-z5ffd" (OuterVolumeSpecName: "kube-api-access-z5ffd") pod "c642182e-867b-4579-9289-0347c94a1e0a" (UID: "c642182e-867b-4579-9289-0347c94a1e0a"). InnerVolumeSpecName "kube-api-access-z5ffd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.149426 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpz7p\" (UniqueName: \"kubernetes.io/projected/d9e804f8-39b6-4544-8117-3bb9c337973b-kube-api-access-xpz7p\") pod \"d9e804f8-39b6-4544-8117-3bb9c337973b\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.149671 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e804f8-39b6-4544-8117-3bb9c337973b-operator-scripts\") pod \"d9e804f8-39b6-4544-8117-3bb9c337973b\" (UID: \"d9e804f8-39b6-4544-8117-3bb9c337973b\") " Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.150075 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c642182e-867b-4579-9289-0347c94a1e0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.150098 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5ffd\" (UniqueName: \"kubernetes.io/projected/c642182e-867b-4579-9289-0347c94a1e0a-kube-api-access-z5ffd\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.150588 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e804f8-39b6-4544-8117-3bb9c337973b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9e804f8-39b6-4544-8117-3bb9c337973b" (UID: "d9e804f8-39b6-4544-8117-3bb9c337973b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.154959 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e804f8-39b6-4544-8117-3bb9c337973b-kube-api-access-xpz7p" (OuterVolumeSpecName: "kube-api-access-xpz7p") pod "d9e804f8-39b6-4544-8117-3bb9c337973b" (UID: "d9e804f8-39b6-4544-8117-3bb9c337973b"). InnerVolumeSpecName "kube-api-access-xpz7p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.252972 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpz7p\" (UniqueName: \"kubernetes.io/projected/d9e804f8-39b6-4544-8117-3bb9c337973b-kube-api-access-xpz7p\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.253058 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e804f8-39b6-4544-8117-3bb9c337973b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.490076 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bwjrj" event={"ID":"c642182e-867b-4579-9289-0347c94a1e0a","Type":"ContainerDied","Data":"43a44a0ea16d52efb0ab01aa38879787a99ba8900128705f7eb68754b292f16b"} Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.490129 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43a44a0ea16d52efb0ab01aa38879787a99ba8900128705f7eb68754b292f16b" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.490084 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bwjrj" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.494458 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5f5ea8d4-ba2b-4abe-b645-0513268421f9","Type":"ContainerStarted","Data":"78c4512cda26fc1c5231bc7d98d45ac8739cbb21786dfbb0e443ec8c003043b6"} Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.495333 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-server-0" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.499004 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c5390ed-d89e-4450-adf3-1363b1150d1b","Type":"ContainerStarted","Data":"6b70899d8462595d68160840ba2f772a58ad47466f6d54dfc39939ee3313bcb9"} Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.499374 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.504931 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bf07-account-create-update-97k2v" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.504956 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf07-account-create-update-97k2v" event={"ID":"d9e804f8-39b6-4544-8117-3bb9c337973b","Type":"ContainerDied","Data":"ae61fbf0f8b1e804592cd40ba6c6c2abf90cde9ef43d20b32847f097489426b2"} Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.505040 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae61fbf0f8b1e804592cd40ba6c6c2abf90cde9ef43d20b32847f097489426b2" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.541975 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.449317196 podStartE2EDuration="57.541950881s" podCreationTimestamp="2025-11-26 13:49:24 +0000 UTC" firstStartedPulling="2025-11-26 13:49:25.98929951 +0000 UTC m=+1134.668501328" lastFinishedPulling="2025-11-26 13:49:43.081933205 +0000 UTC m=+1151.761135013" observedRunningTime="2025-11-26 13:50:21.526747608 +0000 UTC m=+1190.205949426" watchObservedRunningTime="2025-11-26 13:50:21.541950881 +0000 UTC m=+1190.221152699" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.578343 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.748605257 podStartE2EDuration="58.578309286s" podCreationTimestamp="2025-11-26 13:49:23 +0000 UTC" firstStartedPulling="2025-11-26 13:49:25.045371823 +0000 UTC m=+1133.724573641" lastFinishedPulling="2025-11-26 13:49:42.875075852 +0000 UTC m=+1151.554277670" observedRunningTime="2025-11-26 13:50:21.566086682 +0000 UTC m=+1190.245288530" watchObservedRunningTime="2025-11-26 13:50:21.578309286 +0000 UTC m=+1190.257511124" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.980736 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:21 crc kubenswrapper[5200]: I1126 13:50:21.988225 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.104021 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa6ca788-d1ff-472b-905b-98575eb7a814-operator-scripts\") pod \"aa6ca788-d1ff-472b-905b-98575eb7a814\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.104154 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfj2k\" (UniqueName: \"kubernetes.io/projected/aa6ca788-d1ff-472b-905b-98575eb7a814-kube-api-access-kfj2k\") pod \"aa6ca788-d1ff-472b-905b-98575eb7a814\" (UID: \"aa6ca788-d1ff-472b-905b-98575eb7a814\") " Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.104323 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzz7g\" (UniqueName: \"kubernetes.io/projected/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-kube-api-access-tzz7g\") pod \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.104403 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-operator-scripts\") pod \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\" (UID: \"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4\") " Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.105942 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa6ca788-d1ff-472b-905b-98575eb7a814-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa6ca788-d1ff-472b-905b-98575eb7a814" (UID: "aa6ca788-d1ff-472b-905b-98575eb7a814"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.109065 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" (UID: "fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.112297 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-kube-api-access-tzz7g" (OuterVolumeSpecName: "kube-api-access-tzz7g") pod "fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" (UID: "fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4"). InnerVolumeSpecName "kube-api-access-tzz7g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.112880 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6ca788-d1ff-472b-905b-98575eb7a814-kube-api-access-kfj2k" (OuterVolumeSpecName: "kube-api-access-kfj2k") pod "aa6ca788-d1ff-472b-905b-98575eb7a814" (UID: "aa6ca788-d1ff-472b-905b-98575eb7a814"). InnerVolumeSpecName "kube-api-access-kfj2k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.206225 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfj2k\" (UniqueName: \"kubernetes.io/projected/aa6ca788-d1ff-472b-905b-98575eb7a814-kube-api-access-kfj2k\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.206293 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzz7g\" (UniqueName: \"kubernetes.io/projected/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-kube-api-access-tzz7g\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.206308 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.206319 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa6ca788-d1ff-472b-905b-98575eb7a814-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.518591 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h9gqm" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.518625 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h9gqm" event={"ID":"aa6ca788-d1ff-472b-905b-98575eb7a814","Type":"ContainerDied","Data":"0662f9a224d48d4dcd30515d955f81afb1a03976c80158dd96b67fee131e0737"} Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.518867 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0662f9a224d48d4dcd30515d955f81afb1a03976c80158dd96b67fee131e0737" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.522461 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e6ea-account-create-update-4jxg5" Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.522467 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e6ea-account-create-update-4jxg5" event={"ID":"fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4","Type":"ContainerDied","Data":"f0cd4f8aedd727898107e9b005be0f34f4a37041054c6a3c4e0204202ea1e6d9"} Nov 26 13:50:22 crc kubenswrapper[5200]: I1126 13:50:22.522528 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0cd4f8aedd727898107e9b005be0f34f4a37041054c6a3c4e0204202ea1e6d9" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.262378 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4b9z9"] Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.263857 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerName="dnsmasq-dns" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.263952 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerName="dnsmasq-dns" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264035 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa6ca788-d1ff-472b-905b-98575eb7a814" containerName="mariadb-database-create" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264106 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6ca788-d1ff-472b-905b-98575eb7a814" containerName="mariadb-database-create" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264200 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9e804f8-39b6-4544-8117-3bb9c337973b" containerName="mariadb-account-create-update" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264257 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e804f8-39b6-4544-8117-3bb9c337973b" containerName="mariadb-account-create-update" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264314 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" containerName="mariadb-account-create-update" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264370 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" containerName="mariadb-account-create-update" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264476 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c642182e-867b-4579-9289-0347c94a1e0a" containerName="mariadb-database-create" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264541 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c642182e-867b-4579-9289-0347c94a1e0a" containerName="mariadb-database-create" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264605 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerName="init" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264657 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerName="init" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.264895 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9e804f8-39b6-4544-8117-3bb9c337973b" containerName="mariadb-account-create-update" Nov 26 13:50:23 crc 
kubenswrapper[5200]: I1126 13:50:23.264968 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" containerName="mariadb-account-create-update" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.265029 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa6ca788-d1ff-472b-905b-98575eb7a814" containerName="mariadb-database-create" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.265083 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c642182e-867b-4579-9289-0347c94a1e0a" containerName="mariadb-database-create" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.265144 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a0b98f9-f27d-429a-a8d0-058f59c8272c" containerName="dnsmasq-dns" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.901085 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4b9z9"] Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.901592 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-4584-account-create-update-cnvc7"] Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.901298 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.949660 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxpf\" (UniqueName: \"kubernetes.io/projected/b57a6d2a-4df7-4101-a9ac-edd0e76be131-kube-api-access-5hxpf\") pod \"glance-db-create-4b9z9\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.950259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57a6d2a-4df7-4101-a9ac-edd0e76be131-operator-scripts\") pod \"glance-db-create-4b9z9\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.953160 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4584-account-create-update-cnvc7"] Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.953329 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:23 crc kubenswrapper[5200]: I1126 13:50:23.955629 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-db-secret\"" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.052645 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-operator-scripts\") pod \"glance-4584-account-create-update-cnvc7\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.052740 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57a6d2a-4df7-4101-a9ac-edd0e76be131-operator-scripts\") pod \"glance-db-create-4b9z9\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.052796 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz46z\" (UniqueName: \"kubernetes.io/projected/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-kube-api-access-fz46z\") pod \"glance-4584-account-create-update-cnvc7\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.052886 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxpf\" (UniqueName: \"kubernetes.io/projected/b57a6d2a-4df7-4101-a9ac-edd0e76be131-kube-api-access-5hxpf\") pod \"glance-db-create-4b9z9\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.054150 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57a6d2a-4df7-4101-a9ac-edd0e76be131-operator-scripts\") pod \"glance-db-create-4b9z9\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.085978 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxpf\" (UniqueName: \"kubernetes.io/projected/b57a6d2a-4df7-4101-a9ac-edd0e76be131-kube-api-access-5hxpf\") pod \"glance-db-create-4b9z9\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.155107 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-operator-scripts\") pod \"glance-4584-account-create-update-cnvc7\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.155202 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz46z\" (UniqueName: \"kubernetes.io/projected/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-kube-api-access-fz46z\") pod \"glance-4584-account-create-update-cnvc7\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 
13:50:24.156022 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-operator-scripts\") pod \"glance-4584-account-create-update-cnvc7\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.173968 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz46z\" (UniqueName: \"kubernetes.io/projected/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-kube-api-access-fz46z\") pod \"glance-4584-account-create-update-cnvc7\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.225434 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:24 crc kubenswrapper[5200]: I1126 13:50:24.269502 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:25 crc kubenswrapper[5200]: I1126 13:50:25.640235 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4584-account-create-update-cnvc7"] Nov 26 13:50:25 crc kubenswrapper[5200]: I1126 13:50:25.757628 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4b9z9"] Nov 26 13:50:25 crc kubenswrapper[5200]: W1126 13:50:25.771361 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb57a6d2a_4df7_4101_a9ac_edd0e76be131.slice/crio-24e93f17fae07f549ec06896fb840ea178fb6f699ad689af104972381a2e4fc0 WatchSource:0}: Error finding container 24e93f17fae07f549ec06896fb840ea178fb6f699ad689af104972381a2e4fc0: Status 404 returned error can't find the container with id 24e93f17fae07f549ec06896fb840ea178fb6f699ad689af104972381a2e4fc0 Nov 26 13:50:26 crc kubenswrapper[5200]: I1126 13:50:26.559879 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4b9z9" event={"ID":"b57a6d2a-4df7-4101-a9ac-edd0e76be131","Type":"ContainerStarted","Data":"24e93f17fae07f549ec06896fb840ea178fb6f699ad689af104972381a2e4fc0"} Nov 26 13:50:26 crc kubenswrapper[5200]: I1126 13:50:26.562212 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4584-account-create-update-cnvc7" event={"ID":"0744fa2d-9b22-413a-8d7d-549c7c5c1fac","Type":"ContainerStarted","Data":"a37d07bf381f0427274230648273ab3a32eb06c4ac34683781cf1633a6574ce9"} Nov 26 13:50:26 crc kubenswrapper[5200]: I1126 13:50:26.566432 5200 generic.go:358] "Generic (PLEG): container finished" podID="f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" containerID="7a8141c0c332f8af8b48e4fd35e84bf7b0c089845033f6d2d80786d4daaee498" exitCode=0 Nov 26 13:50:26 crc kubenswrapper[5200]: I1126 13:50:26.566589 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wn5kl" event={"ID":"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6","Type":"ContainerDied","Data":"7a8141c0c332f8af8b48e4fd35e84bf7b0c089845033f6d2d80786d4daaee498"} Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.528178 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " 
pod="openstack/swift-storage-0" Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.543873 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"swift-storage-0\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " pod="openstack/swift-storage-0" Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.579395 5200 generic.go:358] "Generic (PLEG): container finished" podID="b57a6d2a-4df7-4101-a9ac-edd0e76be131" containerID="b1467a6c8e757d0cebe3388af58f90782af6cdaa6f141e954be99d27e7210802" exitCode=0 Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.579551 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4b9z9" event={"ID":"b57a6d2a-4df7-4101-a9ac-edd0e76be131","Type":"ContainerDied","Data":"b1467a6c8e757d0cebe3388af58f90782af6cdaa6f141e954be99d27e7210802"} Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.581674 5200 generic.go:358] "Generic (PLEG): container finished" podID="0744fa2d-9b22-413a-8d7d-549c7c5c1fac" containerID="dd1df3fe3624bda2e4605d659f481bd396a2428753503441f5cac1b6400e3793" exitCode=0 Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.582386 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4584-account-create-update-cnvc7" event={"ID":"0744fa2d-9b22-413a-8d7d-549c7c5c1fac","Type":"ContainerDied","Data":"dd1df3fe3624bda2e4605d659f481bd396a2428753503441f5cac1b6400e3793"} Nov 26 13:50:27 crc kubenswrapper[5200]: I1126 13:50:27.628110 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.034853 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.140418 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-combined-ca-bundle\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.140667 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-scripts\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.140768 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6c4r\" (UniqueName: \"kubernetes.io/projected/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-kube-api-access-c6c4r\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.140872 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-ring-data-devices\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.141067 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-swiftconf\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.141142 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-etc-swift\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.141217 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-dispersionconf\") pod \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\" (UID: \"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6\") " Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.144229 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.144737 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.149542 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-kube-api-access-c6c4r" (OuterVolumeSpecName: "kube-api-access-c6c4r") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "kube-api-access-c6c4r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.152671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.169830 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.171860 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-scripts" (OuterVolumeSpecName: "scripts") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.179339 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" (UID: "f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243319 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243830 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6c4r\" (UniqueName: \"kubernetes.io/projected/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-kube-api-access-c6c4r\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243848 5200 reconciler_common.go:299] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243926 5200 reconciler_common.go:299] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243940 5200 reconciler_common.go:299] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243950 5200 reconciler_common.go:299] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.243966 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.323911 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 26 13:50:28 crc kubenswrapper[5200]: W1126 13:50:28.332914 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e8c608b_5148_492a_a8a7_b8eccc586561.slice/crio-48497cc22cc886617125fd444649afd27364fb12110d4bab92a3a69eb60bd8b4 WatchSource:0}: Error finding container 48497cc22cc886617125fd444649afd27364fb12110d4bab92a3a69eb60bd8b4: Status 404 returned error can't find the container with id 48497cc22cc886617125fd444649afd27364fb12110d4bab92a3a69eb60bd8b4 Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.601993 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wn5kl" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.602129 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wn5kl" event={"ID":"f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6","Type":"ContainerDied","Data":"c23499f6fa04ab52a4512c676cafea5cb7c2ffc03d559e9236fa0461624404f3"} Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.602233 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c23499f6fa04ab52a4512c676cafea5cb7c2ffc03d559e9236fa0461624404f3" Nov 26 13:50:28 crc kubenswrapper[5200]: I1126 13:50:28.605804 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"48497cc22cc886617125fd444649afd27364fb12110d4bab92a3a69eb60bd8b4"} Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.132661 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kp5s8" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" probeResult="failure" output=< Nov 26 13:50:29 crc kubenswrapper[5200]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 13:50:29 crc kubenswrapper[5200]: > Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.139976 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.267841 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz46z\" (UniqueName: \"kubernetes.io/projected/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-kube-api-access-fz46z\") pod \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.268351 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-operator-scripts\") pod \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\" (UID: \"0744fa2d-9b22-413a-8d7d-549c7c5c1fac\") " Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.269400 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0744fa2d-9b22-413a-8d7d-549c7c5c1fac" (UID: "0744fa2d-9b22-413a-8d7d-549c7c5c1fac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.282169 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-kube-api-access-fz46z" (OuterVolumeSpecName: "kube-api-access-fz46z") pod "0744fa2d-9b22-413a-8d7d-549c7c5c1fac" (UID: "0744fa2d-9b22-413a-8d7d-549c7c5c1fac"). InnerVolumeSpecName "kube-api-access-fz46z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.286801 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.371308 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.371353 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz46z\" (UniqueName: \"kubernetes.io/projected/0744fa2d-9b22-413a-8d7d-549c7c5c1fac-kube-api-access-fz46z\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.472416 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxpf\" (UniqueName: \"kubernetes.io/projected/b57a6d2a-4df7-4101-a9ac-edd0e76be131-kube-api-access-5hxpf\") pod \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.472640 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57a6d2a-4df7-4101-a9ac-edd0e76be131-operator-scripts\") pod \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\" (UID: \"b57a6d2a-4df7-4101-a9ac-edd0e76be131\") " Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.473941 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57a6d2a-4df7-4101-a9ac-edd0e76be131-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b57a6d2a-4df7-4101-a9ac-edd0e76be131" (UID: "b57a6d2a-4df7-4101-a9ac-edd0e76be131"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.478630 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57a6d2a-4df7-4101-a9ac-edd0e76be131-kube-api-access-5hxpf" (OuterVolumeSpecName: "kube-api-access-5hxpf") pod "b57a6d2a-4df7-4101-a9ac-edd0e76be131" (UID: "b57a6d2a-4df7-4101-a9ac-edd0e76be131"). InnerVolumeSpecName "kube-api-access-5hxpf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.576003 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b57a6d2a-4df7-4101-a9ac-edd0e76be131-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.576048 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5hxpf\" (UniqueName: \"kubernetes.io/projected/b57a6d2a-4df7-4101-a9ac-edd0e76be131-kube-api-access-5hxpf\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.619155 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4b9z9" event={"ID":"b57a6d2a-4df7-4101-a9ac-edd0e76be131","Type":"ContainerDied","Data":"24e93f17fae07f549ec06896fb840ea178fb6f699ad689af104972381a2e4fc0"} Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.619708 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e93f17fae07f549ec06896fb840ea178fb6f699ad689af104972381a2e4fc0" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.619170 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4b9z9" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.633371 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4584-account-create-update-cnvc7" event={"ID":"0744fa2d-9b22-413a-8d7d-549c7c5c1fac","Type":"ContainerDied","Data":"a37d07bf381f0427274230648273ab3a32eb06c4ac34683781cf1633a6574ce9"} Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.633443 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a37d07bf381f0427274230648273ab3a32eb06c4ac34683781cf1633a6574ce9" Nov 26 13:50:29 crc kubenswrapper[5200]: I1126 13:50:29.633627 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4584-account-create-update-cnvc7" Nov 26 13:50:30 crc kubenswrapper[5200]: I1126 13:50:30.649751 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"42f52ac7a2a3d8ce2846507f56c360127f1cd001844f1aa503e54d119743673b"} Nov 26 13:50:30 crc kubenswrapper[5200]: I1126 13:50:30.650360 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"2eb02dc54fa136e7c5025e9634090a7041d9069cad341af2a5616acb5c290480"} Nov 26 13:50:30 crc kubenswrapper[5200]: I1126 13:50:30.650378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"87378e41d86dff6f16d62f1a53be1da15948d42fc9a54646c7077b0037089474"} Nov 26 13:50:30 crc kubenswrapper[5200]: I1126 13:50:30.650393 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"903ac30a9dbec48f735e990bea93a6806f1762703320ed6711cbac6fbff79c06"} Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.151192 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.155388 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.421654 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kp5s8-config-ghrxw"] Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423260 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" containerName="swift-ring-rebalance" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423278 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" containerName="swift-ring-rebalance" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423293 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b57a6d2a-4df7-4101-a9ac-edd0e76be131" containerName="mariadb-database-create" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423300 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57a6d2a-4df7-4101-a9ac-edd0e76be131" containerName="mariadb-database-create" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423339 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0744fa2d-9b22-413a-8d7d-549c7c5c1fac" containerName="mariadb-account-create-update" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423348 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0744fa2d-9b22-413a-8d7d-549c7c5c1fac" containerName="mariadb-account-create-update" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423517 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" containerName="swift-ring-rebalance" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423526 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0744fa2d-9b22-413a-8d7d-549c7c5c1fac" containerName="mariadb-account-create-update" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.423540 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b57a6d2a-4df7-4101-a9ac-edd0e76be131" containerName="mariadb-database-create" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.428746 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.431355 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-extra-scripts\"" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.433654 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kp5s8-config-ghrxw"] Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.533743 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.537403 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.564088 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-additional-scripts\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.564157 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.564195 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-scripts\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.564245 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-log-ovn\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.564273 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run-ovn\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.564335 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24sbb\" (UniqueName: \"kubernetes.io/projected/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-kube-api-access-24sbb\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.666189 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24sbb\" (UniqueName: \"kubernetes.io/projected/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-kube-api-access-24sbb\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.666257 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-additional-scripts\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.666344 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.666414 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-scripts\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.666511 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-log-ovn\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.666563 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run-ovn\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.668161 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run-ovn\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 
crc kubenswrapper[5200]: I1126 13:50:32.668213 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-log-ovn\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.669346 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.669739 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-additional-scripts\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.670307 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-scripts\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.675893 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"f5ee039fbe62517040d20e45c50d6d689b188548be0f48205093fbd0376ceb73"} Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.675938 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"73f91d32af16b5469f8f5d9eeb6ad5b3a0299f66553fb69a26e37fd1e4724334"} Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.675957 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"2ab0a7853e519d877df1c22e5ebeb99b1c64c7a449e720446cdb4ecc46d44e12"} Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.675966 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"8b51fec0244bcdc2b862bc7cdd69319c60e9093b1b1eee3641c5eec3c4239828"} Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.703298 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24sbb\" (UniqueName: \"kubernetes.io/projected/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-kube-api-access-24sbb\") pod \"ovn-controller-kp5s8-config-ghrxw\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.778088 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:32 crc kubenswrapper[5200]: I1126 13:50:32.967212 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-txf2x"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.105816 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.108885 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-0a47-account-create-update-n4bxl"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.116873 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-txf2x"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.117009 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.122022 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-db-secret\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.124623 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a47-account-create-update-n4bxl"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.133304 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-69qj9"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.147188 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.154816 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.154894 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.176065 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-69qj9"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.239521 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-f459-account-create-update-rcg98"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.245770 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.251989 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-db-secret\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.258818 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-operator-scripts\") pod \"cinder-db-create-69qj9\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.258888 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpp5m\" (UniqueName: \"kubernetes.io/projected/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-kube-api-access-xpp5m\") pod \"barbican-0a47-account-create-update-n4bxl\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.258924 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-operator-scripts\") pod \"barbican-db-create-txf2x\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.258957 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-operator-scripts\") pod \"barbican-0a47-account-create-update-n4bxl\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.258984 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv4ks\" (UniqueName: \"kubernetes.io/projected/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-kube-api-access-lv4ks\") pod \"cinder-db-create-69qj9\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.259037 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6srkr\" (UniqueName: \"kubernetes.io/projected/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-kube-api-access-6srkr\") pod \"barbican-db-create-txf2x\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.269334 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f459-account-create-update-rcg98"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.353470 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-gsmtt"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365149 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365256 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-operator-scripts\") pod \"cinder-f459-account-create-update-rcg98\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365322 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2snx\" (UniqueName: \"kubernetes.io/projected/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-kube-api-access-j2snx\") pod \"cinder-f459-account-create-update-rcg98\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365400 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-operator-scripts\") pod \"cinder-db-create-69qj9\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365460 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpp5m\" (UniqueName: \"kubernetes.io/projected/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-kube-api-access-xpp5m\") pod \"barbican-0a47-account-create-update-n4bxl\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365503 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-operator-scripts\") pod \"barbican-db-create-txf2x\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365556 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-operator-scripts\") pod \"barbican-0a47-account-create-update-n4bxl\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365591 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv4ks\" (UniqueName: \"kubernetes.io/projected/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-kube-api-access-lv4ks\") pod \"cinder-db-create-69qj9\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.365699 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6srkr\" (UniqueName: \"kubernetes.io/projected/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-kube-api-access-6srkr\") pod \"barbican-db-create-txf2x\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.367123 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-operator-scripts\") pod \"cinder-db-create-69qj9\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.368460 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-operator-scripts\") pod \"barbican-0a47-account-create-update-n4bxl\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.369106 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.369284 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-operator-scripts\") pod \"barbican-db-create-txf2x\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.369398 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.375772 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.376069 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-xzmjl\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.379750 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-ca07-account-create-update-zddj2"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.390518 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7lcx4"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.391062 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.394938 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-db-secret\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.401584 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gsmtt"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.401836 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.406081 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpp5m\" (UniqueName: \"kubernetes.io/projected/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-kube-api-access-xpp5m\") pod \"barbican-0a47-account-create-update-n4bxl\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.428103 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv4ks\" (UniqueName: \"kubernetes.io/projected/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-kube-api-access-lv4ks\") pod \"cinder-db-create-69qj9\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.436442 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6srkr\" (UniqueName: \"kubernetes.io/projected/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-kube-api-access-6srkr\") pod \"barbican-db-create-txf2x\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.445208 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ca07-account-create-update-zddj2"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.467448 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-operator-scripts\") pod \"cinder-f459-account-create-update-rcg98\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.467530 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2dcn\" (UniqueName: \"kubernetes.io/projected/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-kube-api-access-b2dcn\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.467562 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2snx\" (UniqueName: \"kubernetes.io/projected/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-kube-api-access-j2snx\") pod \"cinder-f459-account-create-update-rcg98\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.467583 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-config-data\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.467603 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-combined-ca-bundle\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.468615 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-operator-scripts\") pod \"cinder-f459-account-create-update-rcg98\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.495736 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2snx\" (UniqueName: \"kubernetes.io/projected/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-kube-api-access-j2snx\") pod \"cinder-f459-account-create-update-rcg98\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.496751 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.515130 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.517908 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7lcx4"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.524297 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.545058 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kp5s8-config-ghrxw"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569396 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2dcn\" (UniqueName: \"kubernetes.io/projected/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-kube-api-access-b2dcn\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569446 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-config-data\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569500 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-combined-ca-bundle\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569526 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktm5\" (UniqueName: \"kubernetes.io/projected/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-kube-api-access-sktm5\") pod \"neutron-db-create-7lcx4\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569581 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbc6b47-81e8-4999-9cd8-1744a64e8158-operator-scripts\") pod \"neutron-ca07-account-create-update-zddj2\" (UID: 
\"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569608 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-operator-scripts\") pod \"neutron-db-create-7lcx4\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569652 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.569894 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6fq\" (UniqueName: \"kubernetes.io/projected/2bbc6b47-81e8-4999-9cd8-1744a64e8158-kube-api-access-6n6fq\") pod \"neutron-ca07-account-create-update-zddj2\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.573815 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-config-data\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.577953 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-combined-ca-bundle\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.587304 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2dcn\" (UniqueName: \"kubernetes.io/projected/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-kube-api-access-b2dcn\") pod \"keystone-db-sync-gsmtt\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.672059 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sktm5\" (UniqueName: \"kubernetes.io/projected/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-kube-api-access-sktm5\") pod \"neutron-db-create-7lcx4\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.672414 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbc6b47-81e8-4999-9cd8-1744a64e8158-operator-scripts\") pod \"neutron-ca07-account-create-update-zddj2\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.672438 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-operator-scripts\") pod \"neutron-db-create-7lcx4\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.672525 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6n6fq\" (UniqueName: \"kubernetes.io/projected/2bbc6b47-81e8-4999-9cd8-1744a64e8158-kube-api-access-6n6fq\") pod \"neutron-ca07-account-create-update-zddj2\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.673438 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbc6b47-81e8-4999-9cd8-1744a64e8158-operator-scripts\") pod \"neutron-ca07-account-create-update-zddj2\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.673826 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-operator-scripts\") pod \"neutron-db-create-7lcx4\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.687819 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8-config-ghrxw" event={"ID":"f5980f0e-828c-494c-b17e-ddf80d7dc3a8","Type":"ContainerStarted","Data":"de42233641bc98324f0b4a3590f9160c5fbaa0299d9bf9899725d467b76a2eab"} Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.695108 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktm5\" (UniqueName: \"kubernetes.io/projected/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-kube-api-access-sktm5\") pod \"neutron-db-create-7lcx4\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.704917 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.708930 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6fq\" (UniqueName: \"kubernetes.io/projected/2bbc6b47-81e8-4999-9cd8-1744a64e8158-kube-api-access-6n6fq\") pod \"neutron-ca07-account-create-update-zddj2\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.758745 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-l8nht"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.764723 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.770267 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-config-data\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.771794 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-q4wxh\"" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.782311 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l8nht"] Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.851366 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.868744 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.876826 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-config-data\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.877168 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-db-sync-config-data\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.877370 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9fms\" (UniqueName: \"kubernetes.io/projected/dd3dd593-fa3c-491a-9f97-84f0dee04150-kube-api-access-p9fms\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.877532 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-combined-ca-bundle\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:33 crc kubenswrapper[5200]: I1126 13:50:33.999745 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-db-sync-config-data\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:33.999825 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9fms\" (UniqueName: \"kubernetes.io/projected/dd3dd593-fa3c-491a-9f97-84f0dee04150-kube-api-access-p9fms\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:33.999861 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-combined-ca-bundle\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:33.999949 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-config-data\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.018575 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-db-sync-config-data\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.021228 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-config-data\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.025103 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9fms\" (UniqueName: \"kubernetes.io/projected/dd3dd593-fa3c-491a-9f97-84f0dee04150-kube-api-access-p9fms\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.051473 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-combined-ca-bundle\") pod \"glance-db-sync-l8nht\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.088364 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l8nht" Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.117486 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kp5s8" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" probeResult="failure" output=< Nov 26 13:50:34 crc kubenswrapper[5200]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 13:50:34 crc kubenswrapper[5200]: > Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.128478 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-txf2x"] Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.179392 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a47-account-create-update-n4bxl"] Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.213271 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-69qj9"] Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.387588 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gsmtt"] Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.400750 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f459-account-create-update-rcg98"] Nov 26 13:50:34 crc kubenswrapper[5200]: W1126 13:50:34.495663 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb9e671_fdb0_40d8_ab5b_df7608f8be45.slice/crio-66c6141534245b8f711ae81fb0cf893fe98c1be3d368a299aa9625ca80fe529e WatchSource:0}: Error finding container 66c6141534245b8f711ae81fb0cf893fe98c1be3d368a299aa9625ca80fe529e: Status 404 returned error can't find the container with id 66c6141534245b8f711ae81fb0cf893fe98c1be3d368a299aa9625ca80fe529e Nov 26 13:50:34 crc kubenswrapper[5200]: W1126 13:50:34.500371 5200 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0569f2fb_6ff0_4cf0_bb17_b2b6fd2a1550.slice/crio-25858d7f518b00aadf851df965943be7e94b145b68c10ba53611ce71ac83ecd3 WatchSource:0}: Error finding container 25858d7f518b00aadf851df965943be7e94b145b68c10ba53611ce71ac83ecd3: Status 404 returned error can't find the container with id 25858d7f518b00aadf851df965943be7e94b145b68c10ba53611ce71ac83ecd3 Nov 26 13:50:34 crc kubenswrapper[5200]: W1126 13:50:34.514266 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e4ee17b_ef5e_4dd6_aa2b_7ffa709193da.slice/crio-8d7efb40d44bec076c259c0ce2dadd37265daa07b15846507784e87d7acbd2bd WatchSource:0}: Error finding container 8d7efb40d44bec076c259c0ce2dadd37265daa07b15846507784e87d7acbd2bd: Status 404 returned error can't find the container with id 8d7efb40d44bec076c259c0ce2dadd37265daa07b15846507784e87d7acbd2bd Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.716734 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-69qj9" event={"ID":"4fb9e671-fdb0-40d8-ab5b-df7608f8be45","Type":"ContainerStarted","Data":"66c6141534245b8f711ae81fb0cf893fe98c1be3d368a299aa9625ca80fe529e"} Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.723865 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8-config-ghrxw" event={"ID":"f5980f0e-828c-494c-b17e-ddf80d7dc3a8","Type":"ContainerStarted","Data":"40d709137da6b2f53759103baa72cc3b0d8e67e3dc4e4e6014e7e28f64cad8aa"} Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.727792 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a47-account-create-update-n4bxl" event={"ID":"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550","Type":"ContainerStarted","Data":"25858d7f518b00aadf851df965943be7e94b145b68c10ba53611ce71ac83ecd3"} Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.738127 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gsmtt" event={"ID":"cb9efb7a-f26f-4635-9b6c-95a008eb05e0","Type":"ContainerStarted","Data":"684168378e79b4ec4ef7afa5a6cfd293d38b79e99ff9e4e0eaf302017d44bcf8"} Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.751572 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f459-account-create-update-rcg98" event={"ID":"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da","Type":"ContainerStarted","Data":"8d7efb40d44bec076c259c0ce2dadd37265daa07b15846507784e87d7acbd2bd"} Nov 26 13:50:34 crc kubenswrapper[5200]: I1126 13:50:34.766935 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-txf2x" event={"ID":"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a","Type":"ContainerStarted","Data":"951f7f8b9e3e39e2e3ff9f07a4e1f3d1e21e55f84b58c33c2c30688eedabb58d"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.062098 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.069854 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.074054 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:50:35 
crc kubenswrapper[5200]: I1126 13:50:35.091403 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.162925 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kp5s8-config-ghrxw" podStartSLOduration=3.162894116 podStartE2EDuration="3.162894116s" podCreationTimestamp="2025-11-26 13:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:34.747863913 +0000 UTC m=+1203.427065751" watchObservedRunningTime="2025-11-26 13:50:35.162894116 +0000 UTC m=+1203.842095934" Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.176016 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7lcx4"] Nov 26 13:50:35 crc kubenswrapper[5200]: W1126 13:50:35.194940 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f0d85c7_7ad6_4e48_b3db_5c82ecfaf6ac.slice/crio-306d2934c390504fd249368d7d9ef643c3e10c7fbcf083ca4557e5ba43c625af WatchSource:0}: Error finding container 306d2934c390504fd249368d7d9ef643c3e10c7fbcf083ca4557e5ba43c625af: Status 404 returned error can't find the container with id 306d2934c390504fd249368d7d9ef643c3e10c7fbcf083ca4557e5ba43c625af Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.262737 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ca07-account-create-update-zddj2"] Nov 26 13:50:35 crc kubenswrapper[5200]: W1126 13:50:35.307101 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bbc6b47_81e8_4999_9cd8_1744a64e8158.slice/crio-9d712d15b82cf4d77f083f94e0cad23a4062665138c8dea8f85a60f908393a43 WatchSource:0}: Error finding container 9d712d15b82cf4d77f083f94e0cad23a4062665138c8dea8f85a60f908393a43: Status 404 returned error can't find the container with id 9d712d15b82cf4d77f083f94e0cad23a4062665138c8dea8f85a60f908393a43 Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.408954 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l8nht"] Nov 26 13:50:35 crc kubenswrapper[5200]: W1126 13:50:35.416941 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3dd593_fa3c_491a_9f97_84f0dee04150.slice/crio-4366911bb3b29f7163785297dccfc2e3db580d9bd1202f0f45663714b4a2eb66 WatchSource:0}: Error finding container 4366911bb3b29f7163785297dccfc2e3db580d9bd1202f0f45663714b4a2eb66: Status 404 returned error can't find the container with id 4366911bb3b29f7163785297dccfc2e3db580d9bd1202f0f45663714b4a2eb66 Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.780387 5200 generic.go:358] "Generic (PLEG): container finished" podID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" containerID="40d709137da6b2f53759103baa72cc3b0d8e67e3dc4e4e6014e7e28f64cad8aa" exitCode=0 Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.780442 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8-config-ghrxw" event={"ID":"f5980f0e-828c-494c-b17e-ddf80d7dc3a8","Type":"ContainerDied","Data":"40d709137da6b2f53759103baa72cc3b0d8e67e3dc4e4e6014e7e28f64cad8aa"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 
13:50:35.793653 5200 generic.go:358] "Generic (PLEG): container finished" podID="0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" containerID="0d7570f872547bd971357e3ece39102ca1a3fdf8d21f961de3c227b7adf7f9d2" exitCode=0 Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.793930 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a47-account-create-update-n4bxl" event={"ID":"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550","Type":"ContainerDied","Data":"0d7570f872547bd971357e3ece39102ca1a3fdf8d21f961de3c227b7adf7f9d2"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.810154 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8nht" event={"ID":"dd3dd593-fa3c-491a-9f97-84f0dee04150","Type":"ContainerStarted","Data":"4366911bb3b29f7163785297dccfc2e3db580d9bd1202f0f45663714b4a2eb66"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.815124 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7lcx4" event={"ID":"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac","Type":"ContainerStarted","Data":"d1b240a40098245ac1b2296169e9eed4b5def878e2367f4306844c7aea45ff74"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.815172 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7lcx4" event={"ID":"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac","Type":"ContainerStarted","Data":"306d2934c390504fd249368d7d9ef643c3e10c7fbcf083ca4557e5ba43c625af"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.817103 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f459-account-create-update-rcg98" event={"ID":"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da","Type":"ContainerStarted","Data":"1cb065ef5a3afa68f135814b127b312d334db2bcc6879e0f0a48d70b9a09a7d0"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.821647 5200 generic.go:358] "Generic (PLEG): container finished" podID="5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" containerID="237e74b2a9070a9cdf1ca3c6faeff26ff070807b7b94c22a1a52300979212498" exitCode=0 Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.821835 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-txf2x" event={"ID":"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a","Type":"ContainerDied","Data":"237e74b2a9070a9cdf1ca3c6faeff26ff070807b7b94c22a1a52300979212498"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.829201 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ca07-account-create-update-zddj2" event={"ID":"2bbc6b47-81e8-4999-9cd8-1744a64e8158","Type":"ContainerStarted","Data":"3bb0c257014ca67f7e212e58afdefd5fd1976e3c93c5241454d4d104f8967837"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.829270 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ca07-account-create-update-zddj2" event={"ID":"2bbc6b47-81e8-4999-9cd8-1744a64e8158","Type":"ContainerStarted","Data":"9d712d15b82cf4d77f083f94e0cad23a4062665138c8dea8f85a60f908393a43"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.923597 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"3bfc3a40523ed141ac9ba98103ae3054008e91cd9185bc26c4c466d7974e3c46"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.923667 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"5c2fe7129b5e32d44a73bbe4f844f7bea44dd368fbaba824bda4a249bc55166e"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.923677 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"f63f390a689d7edab2c7f57aaab9b53c11b9ab447c109328c91f7c6d40c5f5ec"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.936412 5200 generic.go:358] "Generic (PLEG): container finished" podID="4fb9e671-fdb0-40d8-ab5b-df7608f8be45" containerID="ab5913c8e3d9589e5f92e5576e882b6f21b5df61dc31eccb2fadecbd632b03e7" exitCode=0 Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.936528 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-69qj9" event={"ID":"4fb9e671-fdb0-40d8-ab5b-df7608f8be45","Type":"ContainerDied","Data":"ab5913c8e3d9589e5f92e5576e882b6f21b5df61dc31eccb2fadecbd632b03e7"} Nov 26 13:50:35 crc kubenswrapper[5200]: I1126 13:50:35.941344 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ca07-account-create-update-zddj2" podStartSLOduration=2.941306321 podStartE2EDuration="2.941306321s" podCreationTimestamp="2025-11-26 13:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:35.932645411 +0000 UTC m=+1204.611847229" watchObservedRunningTime="2025-11-26 13:50:35.941306321 +0000 UTC m=+1204.620508159" Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.957273 5200 generic.go:358] "Generic (PLEG): container finished" podID="7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" containerID="d1b240a40098245ac1b2296169e9eed4b5def878e2367f4306844c7aea45ff74" exitCode=0 Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.957431 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7lcx4" event={"ID":"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac","Type":"ContainerDied","Data":"d1b240a40098245ac1b2296169e9eed4b5def878e2367f4306844c7aea45ff74"} Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.962123 5200 generic.go:358] "Generic (PLEG): container finished" podID="1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" containerID="1cb065ef5a3afa68f135814b127b312d334db2bcc6879e0f0a48d70b9a09a7d0" exitCode=0 Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.962235 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f459-account-create-update-rcg98" event={"ID":"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da","Type":"ContainerDied","Data":"1cb065ef5a3afa68f135814b127b312d334db2bcc6879e0f0a48d70b9a09a7d0"} Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.964823 5200 generic.go:358] "Generic (PLEG): container finished" podID="2bbc6b47-81e8-4999-9cd8-1744a64e8158" containerID="3bb0c257014ca67f7e212e58afdefd5fd1976e3c93c5241454d4d104f8967837" exitCode=0 Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.964925 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ca07-account-create-update-zddj2" event={"ID":"2bbc6b47-81e8-4999-9cd8-1744a64e8158","Type":"ContainerDied","Data":"3bb0c257014ca67f7e212e58afdefd5fd1976e3c93c5241454d4d104f8967837"} Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.996724 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"8a3c5bcf0c2ae9a26e17f253d1e64688670620c4dc1055a422aca31fb19825c3"} Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.996770 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"87a7e0f4bc1babfa0daa5d1e1ca5980686a23a89b61b6d2d33f6849a4eb58074"} Nov 26 13:50:36 crc kubenswrapper[5200]: I1126 13:50:36.996780 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"e400c5c5cf3eebf7e008c823d4b2464e8e01eeefe341a90c08ccbb54d5c9bc15"} Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.554493 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.573266 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.577028 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.582200 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.587523 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.596750 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sktm5\" (UniqueName: \"kubernetes.io/projected/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-kube-api-access-sktm5\") pod \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.596954 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-operator-scripts\") pod \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\" (UID: \"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.599166 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" (UID: "7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.608971 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-kube-api-access-sktm5" (OuterVolumeSpecName: "kube-api-access-sktm5") pod "7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" (UID: "7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac"). InnerVolumeSpecName "kube-api-access-sktm5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.610087 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.698979 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv4ks\" (UniqueName: \"kubernetes.io/projected/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-kube-api-access-lv4ks\") pod \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699033 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2snx\" (UniqueName: \"kubernetes.io/projected/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-kube-api-access-j2snx\") pod \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699130 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run-ovn\") pod \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699159 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-log-ovn\") pod \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699200 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-operator-scripts\") pod \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\" (UID: \"4fb9e671-fdb0-40d8-ab5b-df7608f8be45\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699222 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6srkr\" (UniqueName: \"kubernetes.io/projected/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-kube-api-access-6srkr\") pod \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699317 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-operator-scripts\") pod \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\" (UID: \"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699347 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-scripts\") pod \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699399 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run\") pod \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699413 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-operator-scripts\") pod 
\"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\" (UID: \"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699492 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-additional-scripts\") pod \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699531 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpp5m\" (UniqueName: \"kubernetes.io/projected/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-kube-api-access-xpp5m\") pod \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699577 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24sbb\" (UniqueName: \"kubernetes.io/projected/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-kube-api-access-24sbb\") pod \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\" (UID: \"f5980f0e-828c-494c-b17e-ddf80d7dc3a8\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.699730 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-operator-scripts\") pod \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\" (UID: \"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550\") " Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.700071 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.700090 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sktm5\" (UniqueName: \"kubernetes.io/projected/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac-kube-api-access-sktm5\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.700224 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" (UID: "1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.700277 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f5980f0e-828c-494c-b17e-ddf80d7dc3a8" (UID: "f5980f0e-828c-494c-b17e-ddf80d7dc3a8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.700342 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f5980f0e-828c-494c-b17e-ddf80d7dc3a8" (UID: "f5980f0e-828c-494c-b17e-ddf80d7dc3a8"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.700935 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" (UID: "5179fdd0-fdef-4ddd-8d4f-0e13054dd47a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.701122 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run" (OuterVolumeSpecName: "var-run") pod "f5980f0e-828c-494c-b17e-ddf80d7dc3a8" (UID: "f5980f0e-828c-494c-b17e-ddf80d7dc3a8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.701348 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fb9e671-fdb0-40d8-ab5b-df7608f8be45" (UID: "4fb9e671-fdb0-40d8-ab5b-df7608f8be45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.702098 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f5980f0e-828c-494c-b17e-ddf80d7dc3a8" (UID: "f5980f0e-828c-494c-b17e-ddf80d7dc3a8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.702400 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-scripts" (OuterVolumeSpecName: "scripts") pod "f5980f0e-828c-494c-b17e-ddf80d7dc3a8" (UID: "f5980f0e-828c-494c-b17e-ddf80d7dc3a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.702771 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" (UID: "0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.704260 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-kube-api-access-6srkr" (OuterVolumeSpecName: "kube-api-access-6srkr") pod "5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" (UID: "5179fdd0-fdef-4ddd-8d4f-0e13054dd47a"). InnerVolumeSpecName "kube-api-access-6srkr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.704988 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-kube-api-access-lv4ks" (OuterVolumeSpecName: "kube-api-access-lv4ks") pod "4fb9e671-fdb0-40d8-ab5b-df7608f8be45" (UID: "4fb9e671-fdb0-40d8-ab5b-df7608f8be45"). 
InnerVolumeSpecName "kube-api-access-lv4ks". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.705213 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-kube-api-access-xpp5m" (OuterVolumeSpecName: "kube-api-access-xpp5m") pod "0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" (UID: "0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550"). InnerVolumeSpecName "kube-api-access-xpp5m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.706097 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-kube-api-access-j2snx" (OuterVolumeSpecName: "kube-api-access-j2snx") pod "1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" (UID: "1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da"). InnerVolumeSpecName "kube-api-access-j2snx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.706947 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-kube-api-access-24sbb" (OuterVolumeSpecName: "kube-api-access-24sbb") pod "f5980f0e-828c-494c-b17e-ddf80d7dc3a8" (UID: "f5980f0e-828c-494c-b17e-ddf80d7dc3a8"). InnerVolumeSpecName "kube-api-access-24sbb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.801942 5200 reconciler_common.go:299] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.801990 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpp5m\" (UniqueName: \"kubernetes.io/projected/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-kube-api-access-xpp5m\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802004 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24sbb\" (UniqueName: \"kubernetes.io/projected/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-kube-api-access-24sbb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802015 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802023 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lv4ks\" (UniqueName: \"kubernetes.io/projected/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-kube-api-access-lv4ks\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802033 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2snx\" (UniqueName: \"kubernetes.io/projected/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-kube-api-access-j2snx\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802043 5200 reconciler_common.go:299] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802052 5200 reconciler_common.go:299] "Volume detached for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802059 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fb9e671-fdb0-40d8-ab5b-df7608f8be45-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802071 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6srkr\" (UniqueName: \"kubernetes.io/projected/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-kube-api-access-6srkr\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802079 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802089 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802100 5200 reconciler_common.go:299] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5980f0e-828c-494c-b17e-ddf80d7dc3a8-var-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.802109 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.879373 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kp5s8-config-ghrxw"] Nov 26 13:50:37 crc kubenswrapper[5200]: I1126 13:50:37.891014 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kp5s8-config-ghrxw"] Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.031040 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7lcx4" event={"ID":"7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac","Type":"ContainerDied","Data":"306d2934c390504fd249368d7d9ef643c3e10c7fbcf083ca4557e5ba43c625af"} Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.031102 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306d2934c390504fd249368d7d9ef643c3e10c7fbcf083ca4557e5ba43c625af" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.031194 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7lcx4" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.049604 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f459-account-create-update-rcg98" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.049617 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f459-account-create-update-rcg98" event={"ID":"1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da","Type":"ContainerDied","Data":"8d7efb40d44bec076c259c0ce2dadd37265daa07b15846507784e87d7acbd2bd"} Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.049664 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7efb40d44bec076c259c0ce2dadd37265daa07b15846507784e87d7acbd2bd" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.052384 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-txf2x" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.052414 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-txf2x" event={"ID":"5179fdd0-fdef-4ddd-8d4f-0e13054dd47a","Type":"ContainerDied","Data":"951f7f8b9e3e39e2e3ff9f07a4e1f3d1e21e55f84b58c33c2c30688eedabb58d"} Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.052439 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="951f7f8b9e3e39e2e3ff9f07a4e1f3d1e21e55f84b58c33c2c30688eedabb58d" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.057720 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerStarted","Data":"d6a3d13e69838eb9d4252d0286848176a74b5a51be636cf9579e83eb743b5064"} Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.064732 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-69qj9" event={"ID":"4fb9e671-fdb0-40d8-ab5b-df7608f8be45","Type":"ContainerDied","Data":"66c6141534245b8f711ae81fb0cf893fe98c1be3d368a299aa9625ca80fe529e"} Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.064767 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-69qj9" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.064788 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66c6141534245b8f711ae81fb0cf893fe98c1be3d368a299aa9625ca80fe529e" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.066898 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de42233641bc98324f0b4a3590f9160c5fbaa0299d9bf9899725d467b76a2eab" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.066992 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kp5s8-config-ghrxw" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.074903 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0a47-account-create-update-n4bxl" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.074906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a47-account-create-update-n4bxl" event={"ID":"0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550","Type":"ContainerDied","Data":"25858d7f518b00aadf851df965943be7e94b145b68c10ba53611ce71ac83ecd3"} Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.075031 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25858d7f518b00aadf851df965943be7e94b145b68c10ba53611ce71ac83ecd3" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.111082 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.780502776 podStartE2EDuration="28.111058904s" podCreationTimestamp="2025-11-26 13:50:10 +0000 UTC" firstStartedPulling="2025-11-26 13:50:28.33648407 +0000 UTC m=+1197.015685888" lastFinishedPulling="2025-11-26 13:50:34.667040198 +0000 UTC m=+1203.346242016" observedRunningTime="2025-11-26 13:50:38.098723166 +0000 UTC m=+1206.777925004" watchObservedRunningTime="2025-11-26 13:50:38.111058904 +0000 UTC m=+1206.790260722" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.141316 5200 status_manager.go:895] "Failed to get status for pod" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" pod="openstack/ovn-controller-kp5s8-config-ghrxw" err="pods \"ovn-controller-kp5s8-config-ghrxw\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.150005 5200 status_manager.go:895] "Failed to get status for pod" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" pod="openstack/ovn-controller-kp5s8-config-ghrxw" err="pods \"ovn-controller-kp5s8-config-ghrxw\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.448491 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d8dd6d87-l29sm"] Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450556 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" containerName="mariadb-account-create-update" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450580 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" containerName="mariadb-account-create-update" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450641 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450651 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450701 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" containerName="ovn-config" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450708 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" containerName="ovn-config" Nov 
26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450720 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450726 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450741 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fb9e671-fdb0-40d8-ab5b-df7608f8be45" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450750 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb9e671-fdb0-40d8-ab5b-df7608f8be45" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450758 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" containerName="mariadb-account-create-update" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450765 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" containerName="mariadb-account-create-update" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450957 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" containerName="ovn-config" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450972 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fb9e671-fdb0-40d8-ab5b-df7608f8be45" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450982 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.450993 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" containerName="mariadb-account-create-update" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.451006 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" containerName="mariadb-account-create-update" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.451020 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" containerName="mariadb-database-create" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.461647 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.466932 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns-swift-storage-0\"" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.487781 5200 status_manager.go:895] "Failed to get status for pod" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" pod="openstack/ovn-controller-kp5s8-config-ghrxw" err="pods \"ovn-controller-kp5s8-config-ghrxw\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.490396 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8dd6d87-l29sm"] Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.520895 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-svc\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.520984 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-sb\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.521009 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-config\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.521106 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-swift-storage-0\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.521192 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-nb\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.521251 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4fg6\" (UniqueName: \"kubernetes.io/projected/9a99c1e3-b276-4a0d-a6be-db034d5570ad-kube-api-access-q4fg6\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.622970 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-svc\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" 
(UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.623427 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-sb\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.623450 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-config\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.623476 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-swift-storage-0\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.623501 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-nb\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.623523 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4fg6\" (UniqueName: \"kubernetes.io/projected/9a99c1e3-b276-4a0d-a6be-db034d5570ad-kube-api-access-q4fg6\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.624604 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-sb\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.625525 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-swift-storage-0\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.625643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-config\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.630511 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-nb\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc 
kubenswrapper[5200]: I1126 13:50:38.630712 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-svc\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.654434 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4fg6\" (UniqueName: \"kubernetes.io/projected/9a99c1e3-b276-4a0d-a6be-db034d5570ad-kube-api-access-q4fg6\") pod \"dnsmasq-dns-d8dd6d87-l29sm\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.789189 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:38 crc kubenswrapper[5200]: I1126 13:50:38.983113 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5980f0e-828c-494c-b17e-ddf80d7dc3a8" path="/var/lib/kubelet/pods/f5980f0e-828c-494c-b17e-ddf80d7dc3a8/volumes" Nov 26 13:50:39 crc kubenswrapper[5200]: I1126 13:50:39.103119 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kp5s8" Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.697463 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.803585 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n6fq\" (UniqueName: \"kubernetes.io/projected/2bbc6b47-81e8-4999-9cd8-1744a64e8158-kube-api-access-6n6fq\") pod \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.804236 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbc6b47-81e8-4999-9cd8-1744a64e8158-operator-scripts\") pod \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\" (UID: \"2bbc6b47-81e8-4999-9cd8-1744a64e8158\") " Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.807349 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbc6b47-81e8-4999-9cd8-1744a64e8158-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bbc6b47-81e8-4999-9cd8-1744a64e8158" (UID: "2bbc6b47-81e8-4999-9cd8-1744a64e8158"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.815702 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbc6b47-81e8-4999-9cd8-1744a64e8158-kube-api-access-6n6fq" (OuterVolumeSpecName: "kube-api-access-6n6fq") pod "2bbc6b47-81e8-4999-9cd8-1744a64e8158" (UID: "2bbc6b47-81e8-4999-9cd8-1744a64e8158"). InnerVolumeSpecName "kube-api-access-6n6fq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.906828 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6n6fq\" (UniqueName: \"kubernetes.io/projected/2bbc6b47-81e8-4999-9cd8-1744a64e8158-kube-api-access-6n6fq\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:41 crc kubenswrapper[5200]: I1126 13:50:41.906865 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbc6b47-81e8-4999-9cd8-1744a64e8158-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.074394 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8dd6d87-l29sm"] Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.128506 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gsmtt" event={"ID":"cb9efb7a-f26f-4635-9b6c-95a008eb05e0","Type":"ContainerStarted","Data":"dd515b54d795cf2f7c82f49b17aaf60ca06232d30e43913d279f5c017012b23a"} Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.135111 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" event={"ID":"9a99c1e3-b276-4a0d-a6be-db034d5570ad","Type":"ContainerStarted","Data":"cc85f159936052bc36f9e0ff3f1da60f591d2fbdc7fa284f07a393aa7628e109"} Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.142064 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ca07-account-create-update-zddj2" event={"ID":"2bbc6b47-81e8-4999-9cd8-1744a64e8158","Type":"ContainerDied","Data":"9d712d15b82cf4d77f083f94e0cad23a4062665138c8dea8f85a60f908393a43"} Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.142107 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d712d15b82cf4d77f083f94e0cad23a4062665138c8dea8f85a60f908393a43" Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.142144 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ca07-account-create-update-zddj2" Nov 26 13:50:42 crc kubenswrapper[5200]: I1126 13:50:42.155635 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-gsmtt" podStartSLOduration=2.054255593 podStartE2EDuration="9.155616594s" podCreationTimestamp="2025-11-26 13:50:33 +0000 UTC" firstStartedPulling="2025-11-26 13:50:34.490967276 +0000 UTC m=+1203.170169084" lastFinishedPulling="2025-11-26 13:50:41.592328257 +0000 UTC m=+1210.271530085" observedRunningTime="2025-11-26 13:50:42.148608808 +0000 UTC m=+1210.827810626" watchObservedRunningTime="2025-11-26 13:50:42.155616594 +0000 UTC m=+1210.834818402" Nov 26 13:50:42 crc kubenswrapper[5200]: E1126 13:50:42.538405 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1438184 actualBytes=10240 Nov 26 13:50:43 crc kubenswrapper[5200]: I1126 13:50:43.164489 5200 generic.go:358] "Generic (PLEG): container finished" podID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerID="88eb3cdf10e248b436583765aef8df88596886f0d9ee9cec3c92643836f79431" exitCode=0 Nov 26 13:50:43 crc kubenswrapper[5200]: I1126 13:50:43.165876 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" event={"ID":"9a99c1e3-b276-4a0d-a6be-db034d5570ad","Type":"ContainerDied","Data":"88eb3cdf10e248b436583765aef8df88596886f0d9ee9cec3c92643836f79431"} Nov 26 13:50:44 crc kubenswrapper[5200]: I1126 13:50:44.178631 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" event={"ID":"9a99c1e3-b276-4a0d-a6be-db034d5570ad","Type":"ContainerStarted","Data":"c185ea375244de32aa16f71ad246b32a82d3afd34cd6ef2754722091997711d2"} Nov 26 13:50:44 crc kubenswrapper[5200]: I1126 13:50:44.180210 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:44 crc kubenswrapper[5200]: I1126 13:50:44.205893 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" podStartSLOduration=6.205862816 podStartE2EDuration="6.205862816s" podCreationTimestamp="2025-11-26 13:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:44.201806018 +0000 UTC m=+1212.881007866" watchObservedRunningTime="2025-11-26 13:50:44.205862816 +0000 UTC m=+1212.885064644" Nov 26 13:50:46 crc kubenswrapper[5200]: I1126 13:50:46.231318 5200 generic.go:358] "Generic (PLEG): container finished" podID="cb9efb7a-f26f-4635-9b6c-95a008eb05e0" containerID="dd515b54d795cf2f7c82f49b17aaf60ca06232d30e43913d279f5c017012b23a" exitCode=0 Nov 26 13:50:46 crc kubenswrapper[5200]: I1126 13:50:46.231508 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gsmtt" event={"ID":"cb9efb7a-f26f-4635-9b6c-95a008eb05e0","Type":"ContainerDied","Data":"dd515b54d795cf2f7c82f49b17aaf60ca06232d30e43913d279f5c017012b23a"} Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.838145 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.929519 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-config-data\") pod \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.929779 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2dcn\" (UniqueName: \"kubernetes.io/projected/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-kube-api-access-b2dcn\") pod \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.929867 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-combined-ca-bundle\") pod \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\" (UID: \"cb9efb7a-f26f-4635-9b6c-95a008eb05e0\") " Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.935789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-kube-api-access-b2dcn" (OuterVolumeSpecName: "kube-api-access-b2dcn") pod "cb9efb7a-f26f-4635-9b6c-95a008eb05e0" (UID: "cb9efb7a-f26f-4635-9b6c-95a008eb05e0"). InnerVolumeSpecName "kube-api-access-b2dcn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.964898 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb9efb7a-f26f-4635-9b6c-95a008eb05e0" (UID: "cb9efb7a-f26f-4635-9b6c-95a008eb05e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:50:50 crc kubenswrapper[5200]: I1126 13:50:50.993025 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-config-data" (OuterVolumeSpecName: "config-data") pod "cb9efb7a-f26f-4635-9b6c-95a008eb05e0" (UID: "cb9efb7a-f26f-4635-9b6c-95a008eb05e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.033162 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2dcn\" (UniqueName: \"kubernetes.io/projected/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-kube-api-access-b2dcn\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.033201 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.033277 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9efb7a-f26f-4635-9b6c-95a008eb05e0-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.234837 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.301105 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb6f9b497-2587r"] Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.301453 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerName="dnsmasq-dns" containerID="cri-o://590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78" gracePeriod=10 Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.318703 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gsmtt" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.319105 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gsmtt" event={"ID":"cb9efb7a-f26f-4635-9b6c-95a008eb05e0","Type":"ContainerDied","Data":"684168378e79b4ec4ef7afa5a6cfd293d38b79e99ff9e4e0eaf302017d44bcf8"} Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.319159 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="684168378e79b4ec4ef7afa5a6cfd293d38b79e99ff9e4e0eaf302017d44bcf8" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.765604 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.951290 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-sb\") pod \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.951523 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwf6g\" (UniqueName: \"kubernetes.io/projected/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-kube-api-access-gwf6g\") pod \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.951560 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-nb\") pod \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.951663 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-config\") pod \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.951785 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-dns-svc\") pod \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\" (UID: \"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2\") " Nov 26 13:50:51 crc kubenswrapper[5200]: I1126 13:50:51.959065 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-kube-api-access-gwf6g" (OuterVolumeSpecName: "kube-api-access-gwf6g") pod "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" (UID: "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2"). InnerVolumeSpecName "kube-api-access-gwf6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.044182 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" (UID: "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.046920 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-config" (OuterVolumeSpecName: "config") pod "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" (UID: "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.059453 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.059515 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.059540 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwf6g\" (UniqueName: \"kubernetes.io/projected/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-kube-api-access-gwf6g\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.061409 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" (UID: "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.070096 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" (UID: "59a26d9f-edf9-4a31-b95a-d4a0894a7ba2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.142795 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mzwfx"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163052 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163104 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163114 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bbc6b47-81e8-4999-9cd8-1744a64e8158" containerName="mariadb-account-create-update" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163154 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbc6b47-81e8-4999-9cd8-1744a64e8158" containerName="mariadb-account-create-update" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163220 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb9efb7a-f26f-4635-9b6c-95a008eb05e0" containerName="keystone-db-sync" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163226 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9efb7a-f26f-4635-9b6c-95a008eb05e0" containerName="keystone-db-sync" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163241 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerName="init" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163247 5200 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerName="init" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163269 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerName="dnsmasq-dns" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163275 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerName="dnsmasq-dns" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163492 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerName="dnsmasq-dns" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163511 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bbc6b47-81e8-4999-9cd8-1744a64e8158" containerName="mariadb-account-create-update" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.163521 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb9efb7a-f26f-4635-9b6c-95a008eb05e0" containerName="keystone-db-sync" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.178547 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77cf58dbf9-sw7wq"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.190350 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mzwfx"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.190523 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.191193 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.201147 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.201562 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.201959 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-xzmjl\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.202136 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.202253 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.202635 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cf58dbf9-sw7wq"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.341186 5200 generic.go:358] "Generic (PLEG): container finished" podID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" containerID="590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78" exitCode=0 Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.341438 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" event={"ID":"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2","Type":"ContainerDied","Data":"590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78"} Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.341481 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" event={"ID":"59a26d9f-edf9-4a31-b95a-d4a0894a7ba2","Type":"ContainerDied","Data":"58d50599bdced46d86ad26d090f80ed97666580fa5ae2106b87be80f0121b7c5"} Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.341507 5200 scope.go:117] "RemoveContainer" containerID="590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.341782 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb6f9b497-2587r" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.353633 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8nht" event={"ID":"dd3dd593-fa3c-491a-9f97-84f0dee04150","Type":"ContainerStarted","Data":"a47711b2713db506a1c2b4c89cdd12db3f54982d27690fb2933e78650de8d05e"} Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375416 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-config-data\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375524 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-credential-keys\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375575 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-svc\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375600 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj8zb\" (UniqueName: \"kubernetes.io/projected/7af63af7-05af-4ee6-a3d1-fa8809059a3e-kube-api-access-wj8zb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375648 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-config\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375695 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375964 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-combined-ca-bundle\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.375988 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.376010 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbnh2\" (UniqueName: \"kubernetes.io/projected/69013620-c0db-4231-8e9b-51427db2b25f-kube-api-access-tbnh2\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.376069 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-scripts\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.376108 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-fernet-keys\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.376183 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.389125 5200 scope.go:117] "RemoveContainer" containerID="c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.394259 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-l8nht" podStartSLOduration=4.131696036 podStartE2EDuration="19.39423193s" podCreationTimestamp="2025-11-26 13:50:33 +0000 UTC" firstStartedPulling="2025-11-26 13:50:35.419056283 +0000 UTC m=+1204.098258101" lastFinishedPulling="2025-11-26 13:50:50.681592177 +0000 UTC m=+1219.360793995" observedRunningTime="2025-11-26 13:50:52.373443049 +0000 UTC m=+1221.052644857" watchObservedRunningTime="2025-11-26 13:50:52.39423193 +0000 UTC m=+1221.073433748" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.413146 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.423185 5200 scope.go:117] "RemoveContainer" containerID="590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78" Nov 26 13:50:52 crc kubenswrapper[5200]: E1126 13:50:52.423940 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78\": container with ID starting with 590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78 not found: ID does not exist" containerID="590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.423996 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78"} err="failed to get container status \"590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78\": rpc error: code = NotFound desc = could not find container \"590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78\": container with ID starting with 590881d78d53e1792bf283f6bfbb3ff62405c36442b421e6e9b12f7eeabd4f78 not found: ID does not exist" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.424026 5200 scope.go:117] "RemoveContainer" containerID="c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b" Nov 26 13:50:52 crc kubenswrapper[5200]: E1126 13:50:52.424588 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b\": container with ID starting with c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b not found: ID does not exist" containerID="c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.424621 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b"} err="failed to get container status \"c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b\": rpc error: code = NotFound desc = could not find container \"c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b\": container with ID starting with c16af112178d1ef04b5d086b1e223796c59faba044ece9e12373142623cb2b8b not found: ID does not exist" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.431102 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb6f9b497-2587r"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.431278 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.433798 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.434808 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.434991 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.442245 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb6f9b497-2587r"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.458941 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g4hpw"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.472478 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vk528"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.477147 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.490980 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.493578 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.494610 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-xx5md\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.514986 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.518493 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-combined-ca-bundle\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.518576 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.518620 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbnh2\" (UniqueName: \"kubernetes.io/projected/69013620-c0db-4231-8e9b-51427db2b25f-kube-api-access-tbnh2\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.518728 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.518781 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-scripts\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.518868 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-fernet-keys\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.519868 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-neutron-dockercfg-7kxzd\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.520283 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-config\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.521227 5200 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-httpd-config\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.521597 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-scripts\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.521888 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522126 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522366 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-config-data\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522467 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-log-httpd\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522551 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-credential-keys\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522673 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-svc\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522799 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj8zb\" (UniqueName: \"kubernetes.io/projected/7af63af7-05af-4ee6-a3d1-fa8809059a3e-kube-api-access-wj8zb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522887 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-run-httpd\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.522975 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-config\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.523085 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.523380 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpsjt\" (UniqueName: \"kubernetes.io/projected/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-kube-api-access-wpsjt\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.523452 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-config-data\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.525213 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-nb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.531017 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-svc\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.535384 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vk528"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.537231 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-sb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.541697 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-config\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.551736 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-swift-storage-0\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 
13:50:52.566902 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-fernet-keys\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.567659 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-credential-keys\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.569358 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-scripts\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.575549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-config-data\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.577748 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj8zb\" (UniqueName: \"kubernetes.io/projected/7af63af7-05af-4ee6-a3d1-fa8809059a3e-kube-api-access-wj8zb\") pod \"dnsmasq-dns-77cf58dbf9-sw7wq\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.588829 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-combined-ca-bundle\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.592829 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbnh2\" (UniqueName: \"kubernetes.io/projected/69013620-c0db-4231-8e9b-51427db2b25f-kube-api-access-tbnh2\") pod \"keystone-bootstrap-mzwfx\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.609276 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pzmsj"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.633022 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.636854 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-config-data\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.637237 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-barbican-dockercfg-d26sf\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638111 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-log-httpd\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638172 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-etc-machine-id\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638193 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-run-httpd\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638238 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpsjt\" (UniqueName: \"kubernetes.io/projected/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-kube-api-access-wpsjt\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638256 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-config-data\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638320 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638347 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-scripts\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638369 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-config-data\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638397 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-combined-ca-bundle\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638419 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-scripts\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638440 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638463 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6pd\" (UniqueName: \"kubernetes.io/projected/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-kube-api-access-lx6pd\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.638495 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-db-sync-config-data\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.654619 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-run-httpd\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.655359 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-log-httpd\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.664519 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-scripts\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.673193 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.674318 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.674409 5200 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-db-sync-g4hpw"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.677810 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-config-data\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.689900 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpsjt\" (UniqueName: \"kubernetes.io/projected/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-kube-api-access-wpsjt\") pod \"ceilometer-0\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.714831 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pzmsj"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.740381 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqpdm\" (UniqueName: \"kubernetes.io/projected/e1df6fce-e471-466b-b8fb-e21823bb3e3f-kube-api-access-cqpdm\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.740978 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-combined-ca-bundle\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741012 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-db-sync-config-data\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741038 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-scripts\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741067 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt788\" (UniqueName: \"kubernetes.io/projected/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-kube-api-access-rt788\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741087 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6pd\" (UniqueName: \"kubernetes.io/projected/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-kube-api-access-lx6pd\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741131 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-db-sync-config-data\") pod 
\"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741168 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-etc-machine-id\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741236 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-combined-ca-bundle\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741291 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-config\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-combined-ca-bundle\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.741362 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-config-data\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.745994 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-config-data\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.746882 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-etc-machine-id\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.749529 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-qp469"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.751824 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-scripts\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.753627 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-combined-ca-bundle\") pod \"cinder-db-sync-g4hpw\" (UID: 
\"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.758506 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-db-sync-config-data\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.763983 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf58dbf9-sw7wq"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.764205 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qp469" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.764534 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6pd\" (UniqueName: \"kubernetes.io/projected/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-kube-api-access-lx6pd\") pod \"cinder-db-sync-g4hpw\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.770219 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-placement-dockercfg-nbggv\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.770294 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-scripts\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.770317 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-config-data\"" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.770966 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qp469"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.772181 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.784702 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65b496c59-vqks6"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.789951 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.796400 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65b496c59-vqks6"] Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.796569 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.823897 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.843312 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqpdm\" (UniqueName: \"kubernetes.io/projected/e1df6fce-e471-466b-b8fb-e21823bb3e3f-kube-api-access-cqpdm\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.843374 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-db-sync-config-data\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.843405 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt788\" (UniqueName: \"kubernetes.io/projected/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-kube-api-access-rt788\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.843486 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-combined-ca-bundle\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.843530 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-config\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.843561 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-combined-ca-bundle\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.850345 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-combined-ca-bundle\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.860489 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-combined-ca-bundle\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.860757 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-db-sync-config-data\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.863719 5200 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.865012 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqpdm\" (UniqueName: \"kubernetes.io/projected/e1df6fce-e471-466b-b8fb-e21823bb3e3f-kube-api-access-cqpdm\") pod \"barbican-db-sync-pzmsj\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.868195 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt788\" (UniqueName: \"kubernetes.io/projected/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-kube-api-access-rt788\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.868757 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-config\") pod \"neutron-db-sync-vk528\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.875873 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vk528" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.946988 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-sb\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947039 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-combined-ca-bundle\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947061 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bngn8\" (UniqueName: \"kubernetes.io/projected/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-kube-api-access-bngn8\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947088 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-scripts\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947122 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-logs\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947167 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-config\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947200 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmq2\" (UniqueName: \"kubernetes.io/projected/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-kube-api-access-rgmq2\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947228 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-svc\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947261 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-nb\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-config-data\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.947310 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-swift-storage-0\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:52 crc kubenswrapper[5200]: I1126 13:50:52.999504 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59a26d9f-edf9-4a31-b95a-d4a0894a7ba2" path="/var/lib/kubelet/pods/59a26d9f-edf9-4a31-b95a-d4a0894a7ba2/volumes" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.029198 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.049720 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-scripts\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050023 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-logs\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050206 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-config\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050312 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmq2\" (UniqueName: \"kubernetes.io/projected/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-kube-api-access-rgmq2\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050379 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-svc\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050462 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-nb\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050477 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-logs\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050507 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-config-data\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050617 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-swift-storage-0\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.050764 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-sb\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.051569 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-nb\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.051600 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-combined-ca-bundle\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.051620 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bngn8\" (UniqueName: \"kubernetes.io/projected/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-kube-api-access-bngn8\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.052026 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-svc\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.053079 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-sb\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.053800 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-swift-storage-0\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.054334 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-config\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.056083 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-scripts\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.056765 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-config-data\") pod 
\"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.059126 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-combined-ca-bundle\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.076100 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bngn8\" (UniqueName: \"kubernetes.io/projected/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-kube-api-access-bngn8\") pod \"placement-db-sync-qp469\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.076208 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmq2\" (UniqueName: \"kubernetes.io/projected/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-kube-api-access-rgmq2\") pod \"dnsmasq-dns-65b496c59-vqks6\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.113186 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qp469" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.127241 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.281751 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:50:53 crc kubenswrapper[5200]: W1126 13:50:53.307001 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d51f5b4_9537_4b72_b1c0_d7c26a94ffc4.slice/crio-2eec25b39db3b1e1e3b26ef78154f9c4728a68a4384476ce26c7dd8a211907bd WatchSource:0}: Error finding container 2eec25b39db3b1e1e3b26ef78154f9c4728a68a4384476ce26c7dd8a211907bd: Status 404 returned error can't find the container with id 2eec25b39db3b1e1e3b26ef78154f9c4728a68a4384476ce26c7dd8a211907bd Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.347974 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf58dbf9-sw7wq"] Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.473339 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerStarted","Data":"2eec25b39db3b1e1e3b26ef78154f9c4728a68a4384476ce26c7dd8a211907bd"} Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.511374 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mzwfx"] Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.531523 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g4hpw"] Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.666264 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vk528"] Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.844571 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pzmsj"] Nov 26 13:50:53 crc kubenswrapper[5200]: W1126 13:50:53.876758 5200 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1df6fce_e471_466b_b8fb_e21823bb3e3f.slice/crio-eb3e3f647ece2549bd4f736c65d15f6f12e67739053c31aefd0ceebc6e28cee0 WatchSource:0}: Error finding container eb3e3f647ece2549bd4f736c65d15f6f12e67739053c31aefd0ceebc6e28cee0: Status 404 returned error can't find the container with id eb3e3f647ece2549bd4f736c65d15f6f12e67739053c31aefd0ceebc6e28cee0 Nov 26 13:50:53 crc kubenswrapper[5200]: W1126 13:50:53.965705 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b8f8610_54ec_4b91_b84e_3b9d424ecc17.slice/crio-1e0f1d3f726d70b542392dfd84a752716e0a9e65f819dda0002355cae0f034d9 WatchSource:0}: Error finding container 1e0f1d3f726d70b542392dfd84a752716e0a9e65f819dda0002355cae0f034d9: Status 404 returned error can't find the container with id 1e0f1d3f726d70b542392dfd84a752716e0a9e65f819dda0002355cae0f034d9 Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.970125 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65b496c59-vqks6"] Nov 26 13:50:53 crc kubenswrapper[5200]: W1126 13:50:53.972971 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd085e2b_49e8_4b11_ac6b_4aa8c968a2ae.slice/crio-70e29797f166e2c208f0fb93ef19ab1b56592d635c5ee7b6d3f5d6f3590d8ae6 WatchSource:0}: Error finding container 70e29797f166e2c208f0fb93ef19ab1b56592d635c5ee7b6d3f5d6f3590d8ae6: Status 404 returned error can't find the container with id 70e29797f166e2c208f0fb93ef19ab1b56592d635c5ee7b6d3f5d6f3590d8ae6 Nov 26 13:50:53 crc kubenswrapper[5200]: I1126 13:50:53.988568 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qp469"] Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.500575 5200 generic.go:358] "Generic (PLEG): container finished" podID="7af63af7-05af-4ee6-a3d1-fa8809059a3e" containerID="ec7369f9fda399de378752dbc037d6ded5c2e714a8c04eba43b22e47c90f232d" exitCode=0 Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.501417 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" event={"ID":"7af63af7-05af-4ee6-a3d1-fa8809059a3e","Type":"ContainerDied","Data":"ec7369f9fda399de378752dbc037d6ded5c2e714a8c04eba43b22e47c90f232d"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.501475 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" event={"ID":"7af63af7-05af-4ee6-a3d1-fa8809059a3e","Type":"ContainerStarted","Data":"f1c89ca3c69f61b2de13252afbbf99cbd81e682c3ed98b997f1590879145f290"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.504039 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qp469" event={"ID":"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae","Type":"ContainerStarted","Data":"70e29797f166e2c208f0fb93ef19ab1b56592d635c5ee7b6d3f5d6f3590d8ae6"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.505679 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzwfx" event={"ID":"69013620-c0db-4231-8e9b-51427db2b25f","Type":"ContainerStarted","Data":"6780bed773fef926c2ee46e41b04518f7f0797ec609aa5198d0c453994b1aebc"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.505811 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzwfx" 
event={"ID":"69013620-c0db-4231-8e9b-51427db2b25f","Type":"ContainerStarted","Data":"560a1a02f569747768518e0dd8049d62963f316a8dad279908eff74f471ce3da"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.510326 5200 generic.go:358] "Generic (PLEG): container finished" podID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerID="3081debc234d9bb17fae81e348b82c22922224e36ab98e815e0aee6a0bc6b83f" exitCode=0 Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.510500 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b496c59-vqks6" event={"ID":"4b8f8610-54ec-4b91-b84e-3b9d424ecc17","Type":"ContainerDied","Data":"3081debc234d9bb17fae81e348b82c22922224e36ab98e815e0aee6a0bc6b83f"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.510580 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b496c59-vqks6" event={"ID":"4b8f8610-54ec-4b91-b84e-3b9d424ecc17","Type":"ContainerStarted","Data":"1e0f1d3f726d70b542392dfd84a752716e0a9e65f819dda0002355cae0f034d9"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.512548 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vk528" event={"ID":"4ce2de04-4f6b-49fa-9946-5ef9663e95e8","Type":"ContainerStarted","Data":"76f5da8064003e6153826b45a00197cff1cb02daddb956c84a81e5f15f4da5d8"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.512581 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vk528" event={"ID":"4ce2de04-4f6b-49fa-9946-5ef9663e95e8","Type":"ContainerStarted","Data":"d687e5b744ef8cb6797d83388d34a99606f25e5d74a192459b1352276f2fefcb"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.514259 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pzmsj" event={"ID":"e1df6fce-e471-466b-b8fb-e21823bb3e3f","Type":"ContainerStarted","Data":"eb3e3f647ece2549bd4f736c65d15f6f12e67739053c31aefd0ceebc6e28cee0"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.522391 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g4hpw" event={"ID":"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01","Type":"ContainerStarted","Data":"4c05765318b302e9083463f6cc5c9c8f10879179cd6a45826b61ade40cad74b0"} Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.552988 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vk528" podStartSLOduration=2.552949951 podStartE2EDuration="2.552949951s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:54.545177885 +0000 UTC m=+1223.224379703" watchObservedRunningTime="2025-11-26 13:50:54.552949951 +0000 UTC m=+1223.232151769" Nov 26 13:50:54 crc kubenswrapper[5200]: I1126 13:50:54.615258 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mzwfx" podStartSLOduration=2.6152390739999998 podStartE2EDuration="2.615239074s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:54.610432937 +0000 UTC m=+1223.289634765" watchObservedRunningTime="2025-11-26 13:50:54.615239074 +0000 UTC m=+1223.294440882" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.032800 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.123178 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-config\") pod \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.123282 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj8zb\" (UniqueName: \"kubernetes.io/projected/7af63af7-05af-4ee6-a3d1-fa8809059a3e-kube-api-access-wj8zb\") pod \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.123325 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-svc\") pod \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.123347 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-swift-storage-0\") pod \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.123394 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-nb\") pod \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.123519 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-sb\") pod \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\" (UID: \"7af63af7-05af-4ee6-a3d1-fa8809059a3e\") " Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.143465 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af63af7-05af-4ee6-a3d1-fa8809059a3e-kube-api-access-wj8zb" (OuterVolumeSpecName: "kube-api-access-wj8zb") pod "7af63af7-05af-4ee6-a3d1-fa8809059a3e" (UID: "7af63af7-05af-4ee6-a3d1-fa8809059a3e"). InnerVolumeSpecName "kube-api-access-wj8zb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.163661 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7af63af7-05af-4ee6-a3d1-fa8809059a3e" (UID: "7af63af7-05af-4ee6-a3d1-fa8809059a3e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.163778 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-config" (OuterVolumeSpecName: "config") pod "7af63af7-05af-4ee6-a3d1-fa8809059a3e" (UID: "7af63af7-05af-4ee6-a3d1-fa8809059a3e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.182507 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7af63af7-05af-4ee6-a3d1-fa8809059a3e" (UID: "7af63af7-05af-4ee6-a3d1-fa8809059a3e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.185514 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7af63af7-05af-4ee6-a3d1-fa8809059a3e" (UID: "7af63af7-05af-4ee6-a3d1-fa8809059a3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.189233 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7af63af7-05af-4ee6-a3d1-fa8809059a3e" (UID: "7af63af7-05af-4ee6-a3d1-fa8809059a3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.226786 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.226829 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj8zb\" (UniqueName: \"kubernetes.io/projected/7af63af7-05af-4ee6-a3d1-fa8809059a3e-kube-api-access-wj8zb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.226839 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.226848 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.226857 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.226865 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af63af7-05af-4ee6-a3d1-fa8809059a3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.540836 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.540852 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cf58dbf9-sw7wq" event={"ID":"7af63af7-05af-4ee6-a3d1-fa8809059a3e","Type":"ContainerDied","Data":"f1c89ca3c69f61b2de13252afbbf99cbd81e682c3ed98b997f1590879145f290"} Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.540999 5200 scope.go:117] "RemoveContainer" containerID="ec7369f9fda399de378752dbc037d6ded5c2e714a8c04eba43b22e47c90f232d" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.548749 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b496c59-vqks6" event={"ID":"4b8f8610-54ec-4b91-b84e-3b9d424ecc17","Type":"ContainerStarted","Data":"c8b425efed21b324dc0eb3805e5722eb74833f7d41b5a287617a6da6c2411594"} Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.548800 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.575576 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65b496c59-vqks6" podStartSLOduration=3.5755524149999998 podStartE2EDuration="3.575552415s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:50:55.57385939 +0000 UTC m=+1224.253061218" watchObservedRunningTime="2025-11-26 13:50:55.575552415 +0000 UTC m=+1224.254754233" Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.647051 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cf58dbf9-sw7wq"] Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.712215 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77cf58dbf9-sw7wq"] Nov 26 13:50:55 crc kubenswrapper[5200]: I1126 13:50:55.819189 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:50:56 crc kubenswrapper[5200]: I1126 13:50:56.984422 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af63af7-05af-4ee6-a3d1-fa8809059a3e" path="/var/lib/kubelet/pods/7af63af7-05af-4ee6-a3d1-fa8809059a3e/volumes" Nov 26 13:50:58 crc kubenswrapper[5200]: I1126 13:50:58.591275 5200 generic.go:358] "Generic (PLEG): container finished" podID="69013620-c0db-4231-8e9b-51427db2b25f" containerID="6780bed773fef926c2ee46e41b04518f7f0797ec609aa5198d0c453994b1aebc" exitCode=0 Nov 26 13:50:58 crc kubenswrapper[5200]: I1126 13:50:58.591388 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mzwfx" event={"ID":"69013620-c0db-4231-8e9b-51427db2b25f","Type":"ContainerDied","Data":"6780bed773fef926c2ee46e41b04518f7f0797ec609aa5198d0c453994b1aebc"} Nov 26 13:51:01 crc kubenswrapper[5200]: I1126 13:51:01.569933 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:51:01 crc kubenswrapper[5200]: I1126 13:51:01.692056 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8dd6d87-l29sm"] Nov 26 13:51:01 crc kubenswrapper[5200]: I1126 13:51:01.692364 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="dnsmasq-dns" 
containerID="cri-o://c185ea375244de32aa16f71ad246b32a82d3afd34cd6ef2754722091997711d2" gracePeriod=10 Nov 26 13:51:02 crc kubenswrapper[5200]: I1126 13:51:02.664848 5200 generic.go:358] "Generic (PLEG): container finished" podID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerID="c185ea375244de32aa16f71ad246b32a82d3afd34cd6ef2754722091997711d2" exitCode=0 Nov 26 13:51:02 crc kubenswrapper[5200]: I1126 13:51:02.664940 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" event={"ID":"9a99c1e3-b276-4a0d-a6be-db034d5570ad","Type":"ContainerDied","Data":"c185ea375244de32aa16f71ad246b32a82d3afd34cd6ef2754722091997711d2"} Nov 26 13:51:02 crc kubenswrapper[5200]: I1126 13:51:02.669587 5200 generic.go:358] "Generic (PLEG): container finished" podID="dd3dd593-fa3c-491a-9f97-84f0dee04150" containerID="a47711b2713db506a1c2b4c89cdd12db3f54982d27690fb2933e78650de8d05e" exitCode=0 Nov 26 13:51:02 crc kubenswrapper[5200]: I1126 13:51:02.669673 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8nht" event={"ID":"dd3dd593-fa3c-491a-9f97-84f0dee04150","Type":"ContainerDied","Data":"a47711b2713db506a1c2b4c89cdd12db3f54982d27690fb2933e78650de8d05e"} Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.155055 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.155456 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.155597 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.156638 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9374219ae715a3e96430de37c40ed96a5c875650a0a5602a0ef3886cef886129"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.156805 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://9374219ae715a3e96430de37c40ed96a5c875650a0a5602a0ef3886cef886129" gracePeriod=600 Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.687530 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="9374219ae715a3e96430de37c40ed96a5c875650a0a5602a0ef3886cef886129" exitCode=0 Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.687776 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"9374219ae715a3e96430de37c40ed96a5c875650a0a5602a0ef3886cef886129"} Nov 26 13:51:03 crc kubenswrapper[5200]: I1126 13:51:03.689197 5200 scope.go:117] "RemoveContainer" containerID="499ed0801b3e77071f6b5d1fa3f9f5b7ce1e53c1e78e3fecf97c942228f0d688" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.155299 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.291651 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-scripts\") pod \"69013620-c0db-4231-8e9b-51427db2b25f\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.291912 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-config-data\") pod \"69013620-c0db-4231-8e9b-51427db2b25f\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.293115 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-combined-ca-bundle\") pod \"69013620-c0db-4231-8e9b-51427db2b25f\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.293274 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-fernet-keys\") pod \"69013620-c0db-4231-8e9b-51427db2b25f\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.293362 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-credential-keys\") pod \"69013620-c0db-4231-8e9b-51427db2b25f\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.293594 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbnh2\" (UniqueName: \"kubernetes.io/projected/69013620-c0db-4231-8e9b-51427db2b25f-kube-api-access-tbnh2\") pod \"69013620-c0db-4231-8e9b-51427db2b25f\" (UID: \"69013620-c0db-4231-8e9b-51427db2b25f\") " Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.302222 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "69013620-c0db-4231-8e9b-51427db2b25f" (UID: "69013620-c0db-4231-8e9b-51427db2b25f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.303251 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-scripts" (OuterVolumeSpecName: "scripts") pod "69013620-c0db-4231-8e9b-51427db2b25f" (UID: "69013620-c0db-4231-8e9b-51427db2b25f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.304456 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69013620-c0db-4231-8e9b-51427db2b25f-kube-api-access-tbnh2" (OuterVolumeSpecName: "kube-api-access-tbnh2") pod "69013620-c0db-4231-8e9b-51427db2b25f" (UID: "69013620-c0db-4231-8e9b-51427db2b25f"). InnerVolumeSpecName "kube-api-access-tbnh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.305538 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "69013620-c0db-4231-8e9b-51427db2b25f" (UID: "69013620-c0db-4231-8e9b-51427db2b25f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.338882 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69013620-c0db-4231-8e9b-51427db2b25f" (UID: "69013620-c0db-4231-8e9b-51427db2b25f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.344022 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-config-data" (OuterVolumeSpecName: "config-data") pod "69013620-c0db-4231-8e9b-51427db2b25f" (UID: "69013620-c0db-4231-8e9b-51427db2b25f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.396056 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.396094 5200 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.396109 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbnh2\" (UniqueName: \"kubernetes.io/projected/69013620-c0db-4231-8e9b-51427db2b25f-kube-api-access-tbnh2\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.396118 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.396130 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.396144 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69013620-c0db-4231-8e9b-51427db2b25f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.714083 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-mzwfx" event={"ID":"69013620-c0db-4231-8e9b-51427db2b25f","Type":"ContainerDied","Data":"560a1a02f569747768518e0dd8049d62963f316a8dad279908eff74f471ce3da"} Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.714150 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="560a1a02f569747768518e0dd8049d62963f316a8dad279908eff74f471ce3da" Nov 26 13:51:05 crc kubenswrapper[5200]: I1126 13:51:05.714240 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mzwfx" Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.261847 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mzwfx"] Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.278044 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mzwfx"] Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.346955 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x7qzt"] Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.349479 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69013620-c0db-4231-8e9b-51427db2b25f" containerName="keystone-bootstrap" Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.349507 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="69013620-c0db-4231-8e9b-51427db2b25f" containerName="keystone-bootstrap" Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.349555 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7af63af7-05af-4ee6-a3d1-fa8809059a3e" containerName="init" Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.349562 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af63af7-05af-4ee6-a3d1-fa8809059a3e" containerName="init" Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.349775 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="69013620-c0db-4231-8e9b-51427db2b25f" containerName="keystone-bootstrap" Nov 26 13:51:06 crc kubenswrapper[5200]: I1126 13:51:06.349792 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7af63af7-05af-4ee6-a3d1-fa8809059a3e" containerName="init" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.590080 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.595910 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-xzmjl\"" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.595949 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.597657 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.597803 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.597977 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.614275 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69013620-c0db-4231-8e9b-51427db2b25f" path="/var/lib/kubelet/pods/69013620-c0db-4231-8e9b-51427db2b25f/volumes" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.617546 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7qzt"] Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.626459 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-credential-keys\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.626742 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27dq\" (UniqueName: \"kubernetes.io/projected/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-kube-api-access-t27dq\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.626914 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-combined-ca-bundle\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.627043 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-fernet-keys\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.627466 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-config-data\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.627817 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-scripts\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.731701 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-scripts\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.731768 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-credential-keys\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.731926 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t27dq\" (UniqueName: \"kubernetes.io/projected/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-kube-api-access-t27dq\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.732000 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-combined-ca-bundle\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.732032 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-fernet-keys\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.732108 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-config-data\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.744181 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-credential-keys\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.744712 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-combined-ca-bundle\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.744857 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-scripts\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " 
pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.745483 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-fernet-keys\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.753609 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-config-data\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.770826 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27dq\" (UniqueName: \"kubernetes.io/projected/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-kube-api-access-t27dq\") pod \"keystone-bootstrap-x7qzt\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:10 crc kubenswrapper[5200]: I1126 13:51:10.928430 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:11 crc kubenswrapper[5200]: I1126 13:51:11.234087 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Nov 26 13:51:13 crc kubenswrapper[5200]: I1126 13:51:13.583835 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.381862 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.394963 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l8nht" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.512947 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-config-data\") pod \"dd3dd593-fa3c-491a-9f97-84f0dee04150\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513071 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9fms\" (UniqueName: \"kubernetes.io/projected/dd3dd593-fa3c-491a-9f97-84f0dee04150-kube-api-access-p9fms\") pod \"dd3dd593-fa3c-491a-9f97-84f0dee04150\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513141 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-combined-ca-bundle\") pod \"dd3dd593-fa3c-491a-9f97-84f0dee04150\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513232 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-nb\") pod \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513270 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-swift-storage-0\") pod \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513326 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4fg6\" (UniqueName: \"kubernetes.io/projected/9a99c1e3-b276-4a0d-a6be-db034d5570ad-kube-api-access-q4fg6\") pod \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513365 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-db-sync-config-data\") pod \"dd3dd593-fa3c-491a-9f97-84f0dee04150\" (UID: \"dd3dd593-fa3c-491a-9f97-84f0dee04150\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.513609 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-sb\") pod \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.514107 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-config\") pod \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.514159 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-svc\") pod 
\"9a99c1e3-b276-4a0d-a6be-db034d5570ad\" (UID: \"9a99c1e3-b276-4a0d-a6be-db034d5570ad\") " Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.521127 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3dd593-fa3c-491a-9f97-84f0dee04150-kube-api-access-p9fms" (OuterVolumeSpecName: "kube-api-access-p9fms") pod "dd3dd593-fa3c-491a-9f97-84f0dee04150" (UID: "dd3dd593-fa3c-491a-9f97-84f0dee04150"). InnerVolumeSpecName "kube-api-access-p9fms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.522402 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd3dd593-fa3c-491a-9f97-84f0dee04150" (UID: "dd3dd593-fa3c-491a-9f97-84f0dee04150"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.523142 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a99c1e3-b276-4a0d-a6be-db034d5570ad-kube-api-access-q4fg6" (OuterVolumeSpecName: "kube-api-access-q4fg6") pod "9a99c1e3-b276-4a0d-a6be-db034d5570ad" (UID: "9a99c1e3-b276-4a0d-a6be-db034d5570ad"). InnerVolumeSpecName "kube-api-access-q4fg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.566145 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd3dd593-fa3c-491a-9f97-84f0dee04150" (UID: "dd3dd593-fa3c-491a-9f97-84f0dee04150"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.572477 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a99c1e3-b276-4a0d-a6be-db034d5570ad" (UID: "9a99c1e3-b276-4a0d-a6be-db034d5570ad"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.573483 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a99c1e3-b276-4a0d-a6be-db034d5570ad" (UID: "9a99c1e3-b276-4a0d-a6be-db034d5570ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.574527 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a99c1e3-b276-4a0d-a6be-db034d5570ad" (UID: "9a99c1e3-b276-4a0d-a6be-db034d5570ad"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.583228 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-config" (OuterVolumeSpecName: "config") pod "9a99c1e3-b276-4a0d-a6be-db034d5570ad" (UID: "9a99c1e3-b276-4a0d-a6be-db034d5570ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.587569 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a99c1e3-b276-4a0d-a6be-db034d5570ad" (UID: "9a99c1e3-b276-4a0d-a6be-db034d5570ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.590086 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-config-data" (OuterVolumeSpecName: "config-data") pod "dd3dd593-fa3c-491a-9f97-84f0dee04150" (UID: "dd3dd593-fa3c-491a-9f97-84f0dee04150"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617116 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617194 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617207 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617219 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617230 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9fms\" (UniqueName: \"kubernetes.io/projected/dd3dd593-fa3c-491a-9f97-84f0dee04150-kube-api-access-p9fms\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617241 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617254 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617264 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a99c1e3-b276-4a0d-a6be-db034d5570ad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617275 5200 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4fg6\" (UniqueName: \"kubernetes.io/projected/9a99c1e3-b276-4a0d-a6be-db034d5570ad-kube-api-access-q4fg6\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.617287 5200 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd3dd593-fa3c-491a-9f97-84f0dee04150-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.819472 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l8nht" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.821305 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8nht" event={"ID":"dd3dd593-fa3c-491a-9f97-84f0dee04150","Type":"ContainerDied","Data":"4366911bb3b29f7163785297dccfc2e3db580d9bd1202f0f45663714b4a2eb66"} Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.821376 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4366911bb3b29f7163785297dccfc2e3db580d9bd1202f0f45663714b4a2eb66" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.824058 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" event={"ID":"9a99c1e3-b276-4a0d-a6be-db034d5570ad","Type":"ContainerDied","Data":"cc85f159936052bc36f9e0ff3f1da60f591d2fbdc7fa284f07a393aa7628e109"} Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.824229 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.875460 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8dd6d87-l29sm"] Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.884308 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d8dd6d87-l29sm"] Nov 26 13:51:14 crc kubenswrapper[5200]: I1126 13:51:14.985048 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" path="/var/lib/kubelet/pods/9a99c1e3-b276-4a0d-a6be-db034d5570ad/volumes" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.486171 5200 scope.go:117] "RemoveContainer" containerID="c185ea375244de32aa16f71ad246b32a82d3afd34cd6ef2754722091997711d2" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.631770 5200 scope.go:117] "RemoveContainer" containerID="88eb3cdf10e248b436583765aef8df88596886f0d9ee9cec3c92643836f79431" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.967194 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f494c55f-m592t"] Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.968909 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="dnsmasq-dns" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.968928 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="dnsmasq-dns" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.968964 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd3dd593-fa3c-491a-9f97-84f0dee04150" containerName="glance-db-sync" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.968973 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dd3dd593-fa3c-491a-9f97-84f0dee04150" containerName="glance-db-sync" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.969029 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="init" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.969035 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="init" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.969180 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd3dd593-fa3c-491a-9f97-84f0dee04150" containerName="glance-db-sync" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.969195 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="dnsmasq-dns" Nov 26 13:51:15 crc kubenswrapper[5200]: I1126 13:51:15.990226 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:15.998872 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f494c55f-m592t"] Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.066705 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-nb\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.066784 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6fr\" (UniqueName: \"kubernetes.io/projected/1247eec6-f811-4028-9e60-ee2e95526022-kube-api-access-md6fr\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.066813 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-swift-storage-0\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.066841 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-sb\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.066879 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-config\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.066985 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-svc\") pod 
\"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.161615 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7qzt"] Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.181048 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-nb\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.181144 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md6fr\" (UniqueName: \"kubernetes.io/projected/1247eec6-f811-4028-9e60-ee2e95526022-kube-api-access-md6fr\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.181188 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-swift-storage-0\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.181207 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-sb\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.181264 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-config\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.181477 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-svc\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.183036 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-nb\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.184233 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-sb\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.184422 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-swift-storage-0\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.188806 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-config\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.191103 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-svc\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.227337 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6fr\" (UniqueName: \"kubernetes.io/projected/1247eec6-f811-4028-9e60-ee2e95526022-kube-api-access-md6fr\") pod \"dnsmasq-dns-79f494c55f-m592t\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.235108 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d8dd6d87-l29sm" podUID="9a99c1e3-b276-4a0d-a6be-db034d5570ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.345232 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.756316 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f494c55f-m592t"] Nov 26 13:51:16 crc kubenswrapper[5200]: W1126 13:51:16.810049 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1247eec6_f811_4028_9e60_ee2e95526022.slice/crio-3cdc738aab6e7e6721e475a00cf1a75816bbff8b486855e51f17c0e91e9dd5f9 WatchSource:0}: Error finding container 3cdc738aab6e7e6721e475a00cf1a75816bbff8b486855e51f17c0e91e9dd5f9: Status 404 returned error can't find the container with id 3cdc738aab6e7e6721e475a00cf1a75816bbff8b486855e51f17c0e91e9dd5f9 Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.878873 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.889936 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.897215 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.897474 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-q4wxh\"" Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.903268 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:16 crc kubenswrapper[5200]: I1126 13:51:16.904454 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\"" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.010396 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7qzt" event={"ID":"8e6b8c32-f5d6-4f06-932a-81c158db7a8c","Type":"ContainerStarted","Data":"415457488fffda18bdfc53b0bac2a90c6c9fe3bafcf3620cbd35fa9928bde6da"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.010438 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7qzt" event={"ID":"8e6b8c32-f5d6-4f06-932a-81c158db7a8c","Type":"ContainerStarted","Data":"cee942d036fd563f8def2182d2a82c6296e5c5827fd535493420c50df4766966"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015224 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015283 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-logs\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015305 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpzj\" (UniqueName: \"kubernetes.io/projected/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-kube-api-access-7bpzj\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015344 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015389 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-scripts\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015445 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-config-data\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.015462 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.029541 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pzmsj" event={"ID":"e1df6fce-e471-466b-b8fb-e21823bb3e3f","Type":"ContainerStarted","Data":"9bc8f43279d7dbd4a5978975de7b27caa4051fc1687a7d4d38d84f361032231d"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.048087 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x7qzt" podStartSLOduration=11.048058663 podStartE2EDuration="11.048058663s" podCreationTimestamp="2025-11-26 13:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:17.031171567 +0000 UTC m=+1245.710373385" watchObservedRunningTime="2025-11-26 13:51:17.048058663 +0000 UTC m=+1245.727260481" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.056215 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"8c9e2fd96d697c529382ab47187145fbd161a7e2364741d5b8b26afa12ecb0a5"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.061457 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pzmsj" podStartSLOduration=3.406916072 podStartE2EDuration="25.061427377s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="2025-11-26 13:50:53.882982764 +0000 UTC m=+1222.562184602" lastFinishedPulling="2025-11-26 13:51:15.537494049 +0000 UTC m=+1244.216695907" observedRunningTime="2025-11-26 13:51:17.049392759 +0000 UTC m=+1245.728594577" watchObservedRunningTime="2025-11-26 13:51:17.061427377 +0000 UTC m=+1245.740629195" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.063964 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerStarted","Data":"0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.066523 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f494c55f-m592t" event={"ID":"1247eec6-f811-4028-9e60-ee2e95526022","Type":"ContainerStarted","Data":"3cdc738aab6e7e6721e475a00cf1a75816bbff8b486855e51f17c0e91e9dd5f9"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.074578 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qp469" event={"ID":"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae","Type":"ContainerStarted","Data":"367602e761bf597b8df1b7576ea8c9760abfec581d9ad4e3e0f5abcc608fde62"} Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118009 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-scripts\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118115 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-config-data\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118138 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118339 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118367 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-logs\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118386 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpzj\" (UniqueName: \"kubernetes.io/projected/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-kube-api-access-7bpzj\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.118445 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.121858 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-qp469" podStartSLOduration=3.6774585159999997 podStartE2EDuration="25.121826405s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="2025-11-26 13:50:53.976576967 +0000 UTC m=+1222.655778785" lastFinishedPulling="2025-11-26 13:51:15.420944856 +0000 UTC m=+1244.100146674" observedRunningTime="2025-11-26 13:51:17.108438111 +0000 UTC m=+1245.787639929" watchObservedRunningTime="2025-11-26 13:51:17.121826405 +0000 UTC m=+1245.801028223" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.130756 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-scripts\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") 
" pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.131680 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.134890 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-config-data\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.135511 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-logs\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.139001 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.149996 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.165067 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpzj\" (UniqueName: \"kubernetes.io/projected/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-kube-api-access-7bpzj\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.188326 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.196797 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.200628 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.204156 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.212058 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.313288 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.325925 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcdc\" (UniqueName: \"kubernetes.io/projected/e8de09e6-19a7-4ddd-b334-7b61a6efe879-kube-api-access-lxcdc\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.325986 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.326099 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.326223 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.326306 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.326481 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.326506 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.428528 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.428918 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.428938 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.429312 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.429502 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.429522 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.429831 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxcdc\" (UniqueName: \"kubernetes.io/projected/e8de09e6-19a7-4ddd-b334-7b61a6efe879-kube-api-access-lxcdc\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.429865 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.430351 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.433388 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.435565 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " 
pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.437022 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.449735 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.470913 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcdc\" (UniqueName: \"kubernetes.io/projected/e8de09e6-19a7-4ddd-b334-7b61a6efe879-kube-api-access-lxcdc\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.473127 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.532776 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:17 crc kubenswrapper[5200]: I1126 13:51:17.974861 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:18 crc kubenswrapper[5200]: I1126 13:51:18.127223 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfe7a06b-ccd8-4ce2-a155-518ae65c3698","Type":"ContainerStarted","Data":"772967c956accc3a58ad21ff4d15c40d8769a290424eb83c891c5cadd18ef46c"} Nov 26 13:51:18 crc kubenswrapper[5200]: I1126 13:51:18.136595 5200 generic.go:358] "Generic (PLEG): container finished" podID="1247eec6-f811-4028-9e60-ee2e95526022" containerID="e1c6eadee1095f9191dc95e8766d873ca66582e5414c9d5c88c857e894141461" exitCode=0 Nov 26 13:51:18 crc kubenswrapper[5200]: I1126 13:51:18.138598 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f494c55f-m592t" event={"ID":"1247eec6-f811-4028-9e60-ee2e95526022","Type":"ContainerDied","Data":"e1c6eadee1095f9191dc95e8766d873ca66582e5414c9d5c88c857e894141461"} Nov 26 13:51:18 crc kubenswrapper[5200]: I1126 13:51:18.229356 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:18 crc kubenswrapper[5200]: I1126 13:51:18.758952 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:18 crc kubenswrapper[5200]: I1126 13:51:18.833648 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:19 crc kubenswrapper[5200]: I1126 13:51:19.160273 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e8de09e6-19a7-4ddd-b334-7b61a6efe879","Type":"ContainerStarted","Data":"970529921a051ad40610a30f0c01bc0c91de6f034259f36a34214cf813c8852d"} Nov 26 13:51:19 crc kubenswrapper[5200]: I1126 13:51:19.165154 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g4hpw" event={"ID":"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01","Type":"ContainerStarted","Data":"1034a642fca91021fb4a1bf5b30b2051f4a68e54854ca3b2ef0040c85597f817"} Nov 26 13:51:19 crc kubenswrapper[5200]: I1126 13:51:19.219226 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g4hpw" podStartSLOduration=5.085026443 podStartE2EDuration="27.219197465s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="2025-11-26 13:50:53.539188692 +0000 UTC m=+1222.218390500" lastFinishedPulling="2025-11-26 13:51:15.673359704 +0000 UTC m=+1244.352561522" observedRunningTime="2025-11-26 13:51:19.189478709 +0000 UTC m=+1247.868680527" watchObservedRunningTime="2025-11-26 13:51:19.219197465 +0000 UTC m=+1247.898399283" Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.199978 5200 generic.go:358] "Generic (PLEG): container finished" podID="bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" containerID="367602e761bf597b8df1b7576ea8c9760abfec581d9ad4e3e0f5abcc608fde62" exitCode=0 Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.200068 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qp469" event={"ID":"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae","Type":"ContainerDied","Data":"367602e761bf597b8df1b7576ea8c9760abfec581d9ad4e3e0f5abcc608fde62"} Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.205208 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8de09e6-19a7-4ddd-b334-7b61a6efe879","Type":"ContainerStarted","Data":"8b0dcbb6dde1ab8a99b1389ac4f42ff9dcc31314d34825da866df889946ee7ab"} Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.237499 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfe7a06b-ccd8-4ce2-a155-518ae65c3698","Type":"ContainerStarted","Data":"642c09929976bf392286138a7151f0a5a906340e3513eb9d22634a8aca9c5be9"} Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.239535 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerStarted","Data":"ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c"} Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.243574 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f494c55f-m592t" event={"ID":"1247eec6-f811-4028-9e60-ee2e95526022","Type":"ContainerStarted","Data":"def7f66910a2c22d2dc2809f86bc8ccad8aa9f0be671fc91e2d65c9c463fa23b"} Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.243649 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:20 crc kubenswrapper[5200]: I1126 13:51:20.268425 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79f494c55f-m592t" podStartSLOduration=5.268400513 podStartE2EDuration="5.268400513s" podCreationTimestamp="2025-11-26 13:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:20.265003964 +0000 UTC 
m=+1248.944205782" watchObservedRunningTime="2025-11-26 13:51:20.268400513 +0000 UTC m=+1248.947602331" Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.259354 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-log" containerID="cri-o://8b0dcbb6dde1ab8a99b1389ac4f42ff9dcc31314d34825da866df889946ee7ab" gracePeriod=30 Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.259374 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8de09e6-19a7-4ddd-b334-7b61a6efe879","Type":"ContainerStarted","Data":"0da0154b8924a2fb69c15af2a20ef9f8134e6a4d2b94ff3b6f30592fd0791b53"} Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.259460 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-httpd" containerID="cri-o://0da0154b8924a2fb69c15af2a20ef9f8134e6a4d2b94ff3b6f30592fd0791b53" gracePeriod=30 Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.276176 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfe7a06b-ccd8-4ce2-a155-518ae65c3698","Type":"ContainerStarted","Data":"aadaec79631c2b6c6680b338094e2f8018ec00a9ff5ce09367f06317429937ec"} Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.276400 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-log" containerID="cri-o://642c09929976bf392286138a7151f0a5a906340e3513eb9d22634a8aca9c5be9" gracePeriod=30 Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.276948 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-httpd" containerID="cri-o://aadaec79631c2b6c6680b338094e2f8018ec00a9ff5ce09367f06317429937ec" gracePeriod=30 Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.297329 5200 generic.go:358] "Generic (PLEG): container finished" podID="8e6b8c32-f5d6-4f06-932a-81c158db7a8c" containerID="415457488fffda18bdfc53b0bac2a90c6c9fe3bafcf3620cbd35fa9928bde6da" exitCode=0 Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.297787 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7qzt" event={"ID":"8e6b8c32-f5d6-4f06-932a-81c158db7a8c","Type":"ContainerDied","Data":"415457488fffda18bdfc53b0bac2a90c6c9fe3bafcf3620cbd35fa9928bde6da"} Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.359837 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.359786768 podStartE2EDuration="5.359786768s" podCreationTimestamp="2025-11-26 13:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:21.284675391 +0000 UTC m=+1249.963877229" watchObservedRunningTime="2025-11-26 13:51:21.359786768 +0000 UTC m=+1250.038988586" Nov 26 13:51:21 crc kubenswrapper[5200]: I1126 13:51:21.385118 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.384659376 
podStartE2EDuration="6.384659376s" podCreationTimestamp="2025-11-26 13:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:21.314674275 +0000 UTC m=+1249.993876113" watchObservedRunningTime="2025-11-26 13:51:21.384659376 +0000 UTC m=+1250.063861184" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.152535 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qp469" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.267237 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-config-data\") pod \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.267337 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-combined-ca-bundle\") pod \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.267505 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-logs\") pod \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.267610 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bngn8\" (UniqueName: \"kubernetes.io/projected/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-kube-api-access-bngn8\") pod \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.267826 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-scripts\") pod \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\" (UID: \"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae\") " Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.269293 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-logs" (OuterVolumeSpecName: "logs") pod "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" (UID: "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.275666 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-kube-api-access-bngn8" (OuterVolumeSpecName: "kube-api-access-bngn8") pod "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" (UID: "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae"). InnerVolumeSpecName "kube-api-access-bngn8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.277276 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-scripts" (OuterVolumeSpecName: "scripts") pod "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" (UID: "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.311672 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-config-data" (OuterVolumeSpecName: "config-data") pod "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" (UID: "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.319491 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qp469" event={"ID":"bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae","Type":"ContainerDied","Data":"70e29797f166e2c208f0fb93ef19ab1b56592d635c5ee7b6d3f5d6f3590d8ae6"} Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.319546 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e29797f166e2c208f0fb93ef19ab1b56592d635c5ee7b6d3f5d6f3590d8ae6" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.319619 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qp469" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.321402 5200 generic.go:358] "Generic (PLEG): container finished" podID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerID="8b0dcbb6dde1ab8a99b1389ac4f42ff9dcc31314d34825da866df889946ee7ab" exitCode=143 Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.321457 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8de09e6-19a7-4ddd-b334-7b61a6efe879","Type":"ContainerDied","Data":"8b0dcbb6dde1ab8a99b1389ac4f42ff9dcc31314d34825da866df889946ee7ab"} Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.323217 5200 generic.go:358] "Generic (PLEG): container finished" podID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerID="642c09929976bf392286138a7151f0a5a906340e3513eb9d22634a8aca9c5be9" exitCode=143 Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.323540 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfe7a06b-ccd8-4ce2-a155-518ae65c3698","Type":"ContainerDied","Data":"642c09929976bf392286138a7151f0a5a906340e3513eb9d22634a8aca9c5be9"} Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.325620 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-779774b9bd-gltc4"] Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.326992 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" containerName="placement-db-sync" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.327010 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" containerName="placement-db-sync" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.327200 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" containerName="placement-db-sync" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.333084 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" (UID: "bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.369840 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.369877 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bngn8\" (UniqueName: \"kubernetes.io/projected/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-kube-api-access-bngn8\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.369886 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.369895 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:22 crc kubenswrapper[5200]: I1126 13:51:22.369908 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.341951 5200 generic.go:358] "Generic (PLEG): container finished" podID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerID="aadaec79631c2b6c6680b338094e2f8018ec00a9ff5ce09367f06317429937ec" exitCode=0 Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.347392 5200 generic.go:358] "Generic (PLEG): container finished" podID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerID="0da0154b8924a2fb69c15af2a20ef9f8134e6a4d2b94ff3b6f30592fd0791b53" exitCode=0 Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.352884 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-779774b9bd-gltc4"] Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.353514 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.356560 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-placement-public-svc\"" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.359209 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-placement-internal-svc\"" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.359860 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-scripts\"" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.359966 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-placement-dockercfg-nbggv\"" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.359997 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-config-data\"" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.396332 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfe7a06b-ccd8-4ce2-a155-518ae65c3698","Type":"ContainerDied","Data":"aadaec79631c2b6c6680b338094e2f8018ec00a9ff5ce09367f06317429937ec"} Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.396404 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8de09e6-19a7-4ddd-b334-7b61a6efe879","Type":"ContainerDied","Data":"0da0154b8924a2fb69c15af2a20ef9f8134e6a4d2b94ff3b6f30592fd0791b53"} Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.398341 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-combined-ca-bundle\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.398519 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpvht\" (UniqueName: \"kubernetes.io/projected/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-kube-api-access-bpvht\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.398641 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-logs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.398765 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-config-data\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.398834 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-scripts\") pod \"placement-779774b9bd-gltc4\" (UID: 
\"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.399086 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-internal-tls-certs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.399172 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-public-tls-certs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503072 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpvht\" (UniqueName: \"kubernetes.io/projected/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-kube-api-access-bpvht\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503191 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-logs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503247 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-config-data\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503301 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-scripts\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-internal-tls-certs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503432 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-public-tls-certs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.503626 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-combined-ca-bundle\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " 
pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.511499 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-logs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.515793 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-internal-tls-certs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.517426 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-public-tls-certs\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.517615 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-config-data\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.519941 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-scripts\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.527584 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-combined-ca-bundle\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.533453 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpvht\" (UniqueName: \"kubernetes.io/projected/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-kube-api-access-bpvht\") pod \"placement-779774b9bd-gltc4\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:23 crc kubenswrapper[5200]: I1126 13:51:23.697755 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.326381 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.475314 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65b496c59-vqks6"] Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.475642 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65b496c59-vqks6" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerName="dnsmasq-dns" containerID="cri-o://c8b425efed21b324dc0eb3805e5722eb74833f7d41b5a287617a6da6c2411594" gracePeriod=10 Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.639328 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.713653 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-combined-ca-bundle\") pod \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.713793 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-credential-keys\") pod \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.714041 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27dq\" (UniqueName: \"kubernetes.io/projected/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-kube-api-access-t27dq\") pod \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.714107 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-fernet-keys\") pod \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.714139 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-config-data\") pod \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.714206 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-scripts\") pod \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\" (UID: \"8e6b8c32-f5d6-4f06-932a-81c158db7a8c\") " Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.723221 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-scripts" (OuterVolumeSpecName: "scripts") pod "8e6b8c32-f5d6-4f06-932a-81c158db7a8c" (UID: "8e6b8c32-f5d6-4f06-932a-81c158db7a8c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.727732 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8e6b8c32-f5d6-4f06-932a-81c158db7a8c" (UID: "8e6b8c32-f5d6-4f06-932a-81c158db7a8c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.730636 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-kube-api-access-t27dq" (OuterVolumeSpecName: "kube-api-access-t27dq") pod "8e6b8c32-f5d6-4f06-932a-81c158db7a8c" (UID: "8e6b8c32-f5d6-4f06-932a-81c158db7a8c"). InnerVolumeSpecName "kube-api-access-t27dq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.737040 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8e6b8c32-f5d6-4f06-932a-81c158db7a8c" (UID: "8e6b8c32-f5d6-4f06-932a-81c158db7a8c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.776990 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e6b8c32-f5d6-4f06-932a-81c158db7a8c" (UID: "8e6b8c32-f5d6-4f06-932a-81c158db7a8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.791931 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-config-data" (OuterVolumeSpecName: "config-data") pod "8e6b8c32-f5d6-4f06-932a-81c158db7a8c" (UID: "8e6b8c32-f5d6-4f06-932a-81c158db7a8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.817299 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.817344 5200 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.817354 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t27dq\" (UniqueName: \"kubernetes.io/projected/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-kube-api-access-t27dq\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.817370 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.817378 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:27 crc kubenswrapper[5200]: I1126 13:51:27.817386 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e6b8c32-f5d6-4f06-932a-81c158db7a8c-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:28 crc kubenswrapper[5200]: W1126 13:51:28.047372 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6817f6_ce07_4b0f_a6c8_d667faa003a8.slice/crio-57bbfc8deb83169bbcacb2dfd5aaf7443e666a2c68daaad333137eb7c9ecc184 WatchSource:0}: Error finding container 57bbfc8deb83169bbcacb2dfd5aaf7443e666a2c68daaad333137eb7c9ecc184: Status 404 returned error can't find the container with id 57bbfc8deb83169bbcacb2dfd5aaf7443e666a2c68daaad333137eb7c9ecc184 Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.047826 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-779774b9bd-gltc4"] Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.454830 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7qzt" event={"ID":"8e6b8c32-f5d6-4f06-932a-81c158db7a8c","Type":"ContainerDied","Data":"cee942d036fd563f8def2182d2a82c6296e5c5827fd535493420c50df4766966"} Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.454887 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee942d036fd563f8def2182d2a82c6296e5c5827fd535493420c50df4766966" Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.455014 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x7qzt" Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.460941 5200 generic.go:358] "Generic (PLEG): container finished" podID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerID="c8b425efed21b324dc0eb3805e5722eb74833f7d41b5a287617a6da6c2411594" exitCode=0 Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.461136 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b496c59-vqks6" event={"ID":"4b8f8610-54ec-4b91-b84e-3b9d424ecc17","Type":"ContainerDied","Data":"c8b425efed21b324dc0eb3805e5722eb74833f7d41b5a287617a6da6c2411594"} Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.462907 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-779774b9bd-gltc4" event={"ID":"4f6817f6-ce07-4b0f-a6c8-d667faa003a8","Type":"ContainerStarted","Data":"57bbfc8deb83169bbcacb2dfd5aaf7443e666a2c68daaad333137eb7c9ecc184"} Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.829519 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-557444d9fd-t9lq7"] Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.831456 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e6b8c32-f5d6-4f06-932a-81c158db7a8c" containerName="keystone-bootstrap" Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.831542 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6b8c32-f5d6-4f06-932a-81c158db7a8c" containerName="keystone-bootstrap" Nov 26 13:51:28 crc kubenswrapper[5200]: I1126 13:51:28.831873 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e6b8c32-f5d6-4f06-932a-81c158db7a8c" containerName="keystone-bootstrap" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.097235 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-557444d9fd-t9lq7"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.100393 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.102843 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.105193 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.105207 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-keystone-public-svc\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.107343 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-keystone-internal-svc\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.116361 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-xzmjl\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.138961 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.164566 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.177959 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248257 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmq2\" (UniqueName: \"kubernetes.io/projected/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-kube-api-access-rgmq2\") pod \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248321 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-scripts\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248450 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-httpd-run\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248471 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-svc\") pod \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248597 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxcdc\" (UniqueName: \"kubernetes.io/projected/e8de09e6-19a7-4ddd-b334-7b61a6efe879-kube-api-access-lxcdc\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248697 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-config\") pod \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248759 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-logs\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248801 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-combined-ca-bundle\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248904 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-config-data\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.248979 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-nb\") pod \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") 
" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249005 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-swift-storage-0\") pod \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249039 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\" (UID: \"e8de09e6-19a7-4ddd-b334-7b61a6efe879\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249059 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-sb\") pod \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\" (UID: \"4b8f8610-54ec-4b91-b84e-3b9d424ecc17\") " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249332 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-credential-keys\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249388 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-combined-ca-bundle\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249430 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-scripts\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249479 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-fernet-keys\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249551 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-internal-tls-certs\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249621 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4n7m\" (UniqueName: \"kubernetes.io/projected/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-kube-api-access-p4n7m\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249655 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-public-tls-certs\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.249709 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-config-data\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.253183 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.267604 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-logs" (OuterVolumeSpecName: "logs") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.288015 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8de09e6-19a7-4ddd-b334-7b61a6efe879-kube-api-access-lxcdc" (OuterVolumeSpecName: "kube-api-access-lxcdc") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "kube-api-access-lxcdc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.290926 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-kube-api-access-rgmq2" (OuterVolumeSpecName: "kube-api-access-rgmq2") pod "4b8f8610-54ec-4b91-b84e-3b9d424ecc17" (UID: "4b8f8610-54ec-4b91-b84e-3b9d424ecc17"). InnerVolumeSpecName "kube-api-access-rgmq2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.304937 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.314926 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-scripts" (OuterVolumeSpecName: "scripts") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.390810 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4n7m\" (UniqueName: \"kubernetes.io/projected/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-kube-api-access-p4n7m\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.390883 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-public-tls-certs\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.390916 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-config-data\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.390966 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-credential-keys\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391020 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-combined-ca-bundle\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391061 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-scripts\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391102 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-fernet-keys\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391149 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-internal-tls-certs\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391233 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgmq2\" (UniqueName: \"kubernetes.io/projected/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-kube-api-access-rgmq2\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391245 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391257 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391266 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxcdc\" (UniqueName: \"kubernetes.io/projected/e8de09e6-19a7-4ddd-b334-7b61a6efe879-kube-api-access-lxcdc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391275 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8de09e6-19a7-4ddd-b334-7b61a6efe879-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.391298 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.403089 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-fernet-keys\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.403752 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-config-data\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.408935 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-credential-keys\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.410012 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-internal-tls-certs\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.415168 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-public-tls-certs\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.417281 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-combined-ca-bundle\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.425481 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-scripts\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.442580 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4n7m\" (UniqueName: \"kubernetes.io/projected/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-kube-api-access-p4n7m\") pod \"keystone-557444d9fd-t9lq7\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.454875 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-config" (OuterVolumeSpecName: "config") pod "4b8f8610-54ec-4b91-b84e-3b9d424ecc17" (UID: "4b8f8610-54ec-4b91-b84e-3b9d424ecc17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.455015 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4b8f8610-54ec-4b91-b84e-3b9d424ecc17" (UID: "4b8f8610-54ec-4b91-b84e-3b9d424ecc17"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.474921 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.483777 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b8f8610-54ec-4b91-b84e-3b9d424ecc17" (UID: "4b8f8610-54ec-4b91-b84e-3b9d424ecc17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.497777 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.497972 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.497988 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.500383 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b496c59-vqks6" event={"ID":"4b8f8610-54ec-4b91-b84e-3b9d424ecc17","Type":"ContainerDied","Data":"1e0f1d3f726d70b542392dfd84a752716e0a9e65f819dda0002355cae0f034d9"} Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.500474 5200 scope.go:117] "RemoveContainer" containerID="c8b425efed21b324dc0eb3805e5722eb74833f7d41b5a287617a6da6c2411594" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.500694 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65b496c59-vqks6" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.510297 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b8f8610-54ec-4b91-b84e-3b9d424ecc17" (UID: "4b8f8610-54ec-4b91-b84e-3b9d424ecc17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.514784 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-config-data" (OuterVolumeSpecName: "config-data") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.529076 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-779774b9bd-gltc4" event={"ID":"4f6817f6-ce07-4b0f-a6c8-d667faa003a8","Type":"ContainerStarted","Data":"01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e"} Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.533808 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8de09e6-19a7-4ddd-b334-7b61a6efe879","Type":"ContainerDied","Data":"970529921a051ad40610a30f0c01bc0c91de6f034259f36a34214cf813c8852d"} Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.533929 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.547892 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8de09e6-19a7-4ddd-b334-7b61a6efe879" (UID: "e8de09e6-19a7-4ddd-b334-7b61a6efe879"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.554569 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.599537 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.599589 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.599600 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.599612 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de09e6-19a7-4ddd-b334-7b61a6efe879-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.614782 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b8f8610-54ec-4b91-b84e-3b9d424ecc17" (UID: "4b8f8610-54ec-4b91-b84e-3b9d424ecc17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.702052 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b8f8610-54ec-4b91-b84e-3b9d424ecc17-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.850411 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65b496c59-vqks6"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.859591 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65b496c59-vqks6"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.888416 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.919915 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.956728 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965660 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerName="dnsmasq-dns" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965725 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerName="dnsmasq-dns" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965774 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-log" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965782 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-log" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965807 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerName="init" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965814 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerName="init" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965887 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-httpd" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.965898 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-httpd" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.966484 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-httpd" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.966504 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" containerName="dnsmasq-dns" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.966530 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" containerName="glance-log" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.977292 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.977404 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.982121 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-internal-svc\"" Nov 26 13:51:29 crc kubenswrapper[5200]: I1126 13:51:29.982279 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120108 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfn8z\" (UniqueName: \"kubernetes.io/projected/baadd4af-f958-45eb-ae2f-eb5ecead58a7-kube-api-access-mfn8z\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120243 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120292 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120379 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120482 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120543 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120571 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.120598 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223049 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223133 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223194 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223274 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223309 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223331 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223360 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223403 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfn8z\" (UniqueName: \"kubernetes.io/projected/baadd4af-f958-45eb-ae2f-eb5ecead58a7-kube-api-access-mfn8z\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.223474 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.224249 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.224536 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.232496 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.232790 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.233849 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.234503 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.251489 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfn8z\" (UniqueName: \"kubernetes.io/projected/baadd4af-f958-45eb-ae2f-eb5ecead58a7-kube-api-access-mfn8z\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.264023 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.317177 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.550465 5200 generic.go:358] "Generic (PLEG): container finished" podID="e1df6fce-e471-466b-b8fb-e21823bb3e3f" containerID="9bc8f43279d7dbd4a5978975de7b27caa4051fc1687a7d4d38d84f361032231d" exitCode=0 Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.550577 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pzmsj" event={"ID":"e1df6fce-e471-466b-b8fb-e21823bb3e3f","Type":"ContainerDied","Data":"9bc8f43279d7dbd4a5978975de7b27caa4051fc1687a7d4d38d84f361032231d"} Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.981992 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b8f8610-54ec-4b91-b84e-3b9d424ecc17" path="/var/lib/kubelet/pods/4b8f8610-54ec-4b91-b84e-3b9d424ecc17/volumes" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.986285 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8de09e6-19a7-4ddd-b334-7b61a6efe879" path="/var/lib/kubelet/pods/e8de09e6-19a7-4ddd-b334-7b61a6efe879/volumes" Nov 26 13:51:30 crc kubenswrapper[5200]: I1126 13:51:30.993491 5200 scope.go:117] "RemoveContainer" containerID="3081debc234d9bb17fae81e348b82c22922224e36ab98e815e0aee6a0bc6b83f" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.122429 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.148599 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.148671 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-config-data\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.148772 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bpzj\" (UniqueName: \"kubernetes.io/projected/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-kube-api-access-7bpzj\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.148800 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-logs\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.148840 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-scripts\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.148874 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-httpd-run\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.149009 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-combined-ca-bundle\") pod \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\" (UID: \"cfe7a06b-ccd8-4ce2-a155-518ae65c3698\") " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.151498 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-logs" (OuterVolumeSpecName: "logs") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.151876 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.175014 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-scripts" (OuterVolumeSpecName: "scripts") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.175680 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-kube-api-access-7bpzj" (OuterVolumeSpecName: "kube-api-access-7bpzj") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "kube-api-access-7bpzj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.176789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.203061 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.219820 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-config-data" (OuterVolumeSpecName: "config-data") pod "cfe7a06b-ccd8-4ce2-a155-518ae65c3698" (UID: "cfe7a06b-ccd8-4ce2-a155-518ae65c3698"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.250601 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.250850 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.250966 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.251056 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7bpzj\" (UniqueName: \"kubernetes.io/projected/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-kube-api-access-7bpzj\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.251144 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.251204 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.251267 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7a06b-ccd8-4ce2-a155-518ae65c3698-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.271455 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.352433 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.567077 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.567860 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfe7a06b-ccd8-4ce2-a155-518ae65c3698","Type":"ContainerDied","Data":"772967c956accc3a58ad21ff4d15c40d8769a290424eb83c891c5cadd18ef46c"} Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.612472 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.622239 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.642233 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.644186 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-log" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.644212 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-log" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.644260 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-httpd" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.644270 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-httpd" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.644594 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-log" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.644617 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" containerName="glance-httpd" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.650912 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.654187 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-public-svc\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.654517 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.665699 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.759945 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ljq\" (UniqueName: \"kubernetes.io/projected/29258dfe-0805-481f-ad5f-a133bd88517e-kube-api-access-z4ljq\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.760021 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.760149 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-logs\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.760246 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.760669 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-config-data\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.760764 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.760872 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-scripts\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.761236 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.862920 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-config-data\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.862972 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.862992 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-scripts\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.863059 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.863090 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ljq\" (UniqueName: \"kubernetes.io/projected/29258dfe-0805-481f-ad5f-a133bd88517e-kube-api-access-z4ljq\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.863137 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.863158 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-logs\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.863182 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.865813 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.866308 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.868660 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-logs\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.874018 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.874100 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-config-data\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.874563 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-scripts\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.875922 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.885476 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ljq\" (UniqueName: \"kubernetes.io/projected/29258dfe-0805-481f-ad5f-a133bd88517e-kube-api-access-z4ljq\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:31 crc kubenswrapper[5200]: I1126 13:51:31.901900 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " pod="openstack/glance-default-external-api-0" Nov 26 13:51:32 crc kubenswrapper[5200]: I1126 13:51:32.019304 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:51:32 crc kubenswrapper[5200]: I1126 13:51:32.992551 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe7a06b-ccd8-4ce2-a155-518ae65c3698" path="/var/lib/kubelet/pods/cfe7a06b-ccd8-4ce2-a155-518ae65c3698/volumes" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.065472 5200 scope.go:117] "RemoveContainer" containerID="0da0154b8924a2fb69c15af2a20ef9f8134e6a4d2b94ff3b6f30592fd0791b53" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.258551 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.275944 5200 scope.go:117] "RemoveContainer" containerID="8b0dcbb6dde1ab8a99b1389ac4f42ff9dcc31314d34825da866df889946ee7ab" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.324993 5200 scope.go:117] "RemoveContainer" containerID="aadaec79631c2b6c6680b338094e2f8018ec00a9ff5ce09367f06317429937ec" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.396351 5200 scope.go:117] "RemoveContainer" containerID="642c09929976bf392286138a7151f0a5a906340e3513eb9d22634a8aca9c5be9" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.404272 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-combined-ca-bundle\") pod \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.404325 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-db-sync-config-data\") pod \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.404475 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqpdm\" (UniqueName: \"kubernetes.io/projected/e1df6fce-e471-466b-b8fb-e21823bb3e3f-kube-api-access-cqpdm\") pod \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\" (UID: \"e1df6fce-e471-466b-b8fb-e21823bb3e3f\") " Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.413956 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e1df6fce-e471-466b-b8fb-e21823bb3e3f" (UID: "e1df6fce-e471-466b-b8fb-e21823bb3e3f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.414881 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1df6fce-e471-466b-b8fb-e21823bb3e3f-kube-api-access-cqpdm" (OuterVolumeSpecName: "kube-api-access-cqpdm") pod "e1df6fce-e471-466b-b8fb-e21823bb3e3f" (UID: "e1df6fce-e471-466b-b8fb-e21823bb3e3f"). InnerVolumeSpecName "kube-api-access-cqpdm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.448508 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1df6fce-e471-466b-b8fb-e21823bb3e3f" (UID: "e1df6fce-e471-466b-b8fb-e21823bb3e3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.506611 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.506955 5200 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1df6fce-e471-466b-b8fb-e21823bb3e3f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.507024 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqpdm\" (UniqueName: \"kubernetes.io/projected/e1df6fce-e471-466b-b8fb-e21823bb3e3f-kube-api-access-cqpdm\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.584926 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.593678 5200 generic.go:358] "Generic (PLEG): container finished" podID="71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" containerID="1034a642fca91021fb4a1bf5b30b2051f4a68e54854ca3b2ef0040c85597f817" exitCode=0 Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.593903 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g4hpw" event={"ID":"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01","Type":"ContainerDied","Data":"1034a642fca91021fb4a1bf5b30b2051f4a68e54854ca3b2ef0040c85597f817"} Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.597923 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerStarted","Data":"a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911"} Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.604055 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-779774b9bd-gltc4" event={"ID":"4f6817f6-ce07-4b0f-a6c8-d667faa003a8","Type":"ContainerStarted","Data":"4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce"} Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.604549 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.605037 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.617174 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pzmsj" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.617185 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pzmsj" event={"ID":"e1df6fce-e471-466b-b8fb-e21823bb3e3f","Type":"ContainerDied","Data":"eb3e3f647ece2549bd4f736c65d15f6f12e67739053c31aefd0ceebc6e28cee0"} Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.617252 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb3e3f647ece2549bd4f736c65d15f6f12e67739053c31aefd0ceebc6e28cee0" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.636635 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-557444d9fd-t9lq7"] Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.660218 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-779774b9bd-gltc4" podStartSLOduration=11.660192289 podStartE2EDuration="11.660192289s" podCreationTimestamp="2025-11-26 13:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:33.644547995 +0000 UTC m=+1262.323749833" watchObservedRunningTime="2025-11-26 13:51:33.660192289 +0000 UTC m=+1262.339394107" Nov 26 13:51:33 crc kubenswrapper[5200]: I1126 13:51:33.809161 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:51:33 crc kubenswrapper[5200]: W1126 13:51:33.819802 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29258dfe_0805_481f_ad5f_a133bd88517e.slice/crio-deca4b24c051fb0576f663ee2c2e7bf8369138e851edcd1b8f3ffffb553091c8 WatchSource:0}: Error finding container deca4b24c051fb0576f663ee2c2e7bf8369138e851edcd1b8f3ffffb553091c8: Status 404 returned error can't find the container with id deca4b24c051fb0576f663ee2c2e7bf8369138e851edcd1b8f3ffffb553091c8 Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.619071 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7544499f5f-8pt52"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.621040 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1df6fce-e471-466b-b8fb-e21823bb3e3f" containerName="barbican-db-sync" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.621069 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1df6fce-e471-466b-b8fb-e21823bb3e3f" containerName="barbican-db-sync" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.621322 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1df6fce-e471-466b-b8fb-e21823bb3e3f" containerName="barbican-db-sync" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.640742 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.655795 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-55d547fcd6-blkmf"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.656201 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-worker-config-data\"" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.656483 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-barbican-dockercfg-d26sf\"" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.656669 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-config-data\"" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.678575 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.687073 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7544499f5f-8pt52"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.688623 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-keystone-listener-config-data\"" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.727524 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-557444d9fd-t9lq7" event={"ID":"11e2f1eb-442a-4f5c-845c-1ecf10744fd1","Type":"ContainerStarted","Data":"38cdc95cb224169d9fc8e6fa2db145ec76be63b846d7793abc56e1bd23843e4c"} Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.727577 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-557444d9fd-t9lq7" event={"ID":"11e2f1eb-442a-4f5c-845c-1ecf10744fd1","Type":"ContainerStarted","Data":"0951a95ec5b27dd8b9357ffa17683cda7d8b9917387fd66d3cfb78e3ab66b742"} Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.727641 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.748263 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29258dfe-0805-481f-ad5f-a133bd88517e","Type":"ContainerStarted","Data":"deca4b24c051fb0576f663ee2c2e7bf8369138e851edcd1b8f3ffffb553091c8"} Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.748593 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xrq\" (UniqueName: \"kubernetes.io/projected/0a85e23d-55dd-4d11-b304-aeb59804f47d-kube-api-access-22xrq\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751318 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-combined-ca-bundle\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751435 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751467 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data-custom\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751527 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwxq\" (UniqueName: \"kubernetes.io/projected/5dd57521-dfcc-4303-9f37-3485c7407851-kube-api-access-gnwxq\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751560 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data-custom\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751678 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-combined-ca-bundle\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751841 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd57521-dfcc-4303-9f37-3485c7407851-logs\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751931 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a85e23d-55dd-4d11-b304-aeb59804f47d-logs\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.751988 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.761079 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"baadd4af-f958-45eb-ae2f-eb5ecead58a7","Type":"ContainerStarted","Data":"0a25d3f0874f9fafd0dd6fc7dee32b46e0f09a9d25839443eb7f11f641f50a5b"} Nov 26 13:51:34 crc 
kubenswrapper[5200]: I1126 13:51:34.761132 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"baadd4af-f958-45eb-ae2f-eb5ecead58a7","Type":"ContainerStarted","Data":"32ee7c0e1f52bf2f26996fda9dc185111c4c04dc761d14107ab92f62f996ed02"} Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.773151 5200 generic.go:358] "Generic (PLEG): container finished" podID="4ce2de04-4f6b-49fa-9946-5ef9663e95e8" containerID="76f5da8064003e6153826b45a00197cff1cb02daddb956c84a81e5f15f4da5d8" exitCode=0 Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.773842 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vk528" event={"ID":"4ce2de04-4f6b-49fa-9946-5ef9663e95e8","Type":"ContainerDied","Data":"76f5da8064003e6153826b45a00197cff1cb02daddb956c84a81e5f15f4da5d8"} Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.786719 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55d547fcd6-blkmf"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.816563 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74654ff48c-9vkqv"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.865350 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd57521-dfcc-4303-9f37-3485c7407851-logs\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.866517 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd57521-dfcc-4303-9f37-3485c7407851-logs\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.866909 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a85e23d-55dd-4d11-b304-aeb59804f47d-logs\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.866984 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867213 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a85e23d-55dd-4d11-b304-aeb59804f47d-logs\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867267 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22xrq\" (UniqueName: \"kubernetes.io/projected/0a85e23d-55dd-4d11-b304-aeb59804f47d-kube-api-access-22xrq\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc 
kubenswrapper[5200]: I1126 13:51:34.867352 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-combined-ca-bundle\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867438 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867471 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data-custom\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867540 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwxq\" (UniqueName: \"kubernetes.io/projected/5dd57521-dfcc-4303-9f37-3485c7407851-kube-api-access-gnwxq\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867578 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data-custom\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.867711 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-combined-ca-bundle\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.872749 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.875637 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74654ff48c-9vkqv"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.877457 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.880837 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data-custom\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.884252 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-combined-ca-bundle\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.884792 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data-custom\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.899677 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-combined-ca-bundle\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.899985 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.907144 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xrq\" (UniqueName: \"kubernetes.io/projected/0a85e23d-55dd-4d11-b304-aeb59804f47d-kube-api-access-22xrq\") pod \"barbican-worker-7544499f5f-8pt52\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.912382 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-557444d9fd-t9lq7" podStartSLOduration=6.912364027 podStartE2EDuration="6.912364027s" podCreationTimestamp="2025-11-26 13:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:34.754241903 +0000 UTC m=+1263.433443721" 
watchObservedRunningTime="2025-11-26 13:51:34.912364027 +0000 UTC m=+1263.591565845" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.929351 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dcbdf5458-6txbv"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.935271 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwxq\" (UniqueName: \"kubernetes.io/projected/5dd57521-dfcc-4303-9f37-3485c7407851-kube-api-access-gnwxq\") pod \"barbican-keystone-listener-55d547fcd6-blkmf\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.951548 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcbdf5458-6txbv"] Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.951845 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.958289 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-api-config-data\"" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.973433 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-logs\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978184 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-config\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978247 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-svc\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978352 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data-custom\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978426 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978475 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-swift-storage-0\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: 
\"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978606 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-sb\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978635 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-combined-ca-bundle\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978655 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-nb\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.978722 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgcm\" (UniqueName: \"kubernetes.io/projected/11f3ae28-91c1-48d2-a885-f08d58e3a272-kube-api-access-zjgcm\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.979449 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzh8j\" (UniqueName: \"kubernetes.io/projected/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-kube-api-access-vzh8j\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:34 crc kubenswrapper[5200]: I1126 13:51:34.990261 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.044037 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081653 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data-custom\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081729 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081761 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-swift-storage-0\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081817 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-sb\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081843 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-combined-ca-bundle\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081860 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-nb\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081883 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgcm\" (UniqueName: \"kubernetes.io/projected/11f3ae28-91c1-48d2-a885-f08d58e3a272-kube-api-access-zjgcm\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081920 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzh8j\" (UniqueName: \"kubernetes.io/projected/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-kube-api-access-vzh8j\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.081984 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-logs\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: 
\"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.082054 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-config\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.082084 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-svc\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.084536 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-nb\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.084765 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-logs\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.093454 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-svc\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.100288 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.100781 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data-custom\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.101054 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-swift-storage-0\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.101256 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-config\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.101796 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-sb\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.108136 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgcm\" (UniqueName: \"kubernetes.io/projected/11f3ae28-91c1-48d2-a885-f08d58e3a272-kube-api-access-zjgcm\") pod \"dnsmasq-dns-74654ff48c-9vkqv\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.112266 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-combined-ca-bundle\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.114937 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzh8j\" (UniqueName: \"kubernetes.io/projected/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-kube-api-access-vzh8j\") pod \"barbican-api-7dcbdf5458-6txbv\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.216122 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.281233 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.286508 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388190 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-db-sync-config-data\") pod \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388272 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-etc-machine-id\") pod \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388365 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" (UID: "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388492 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-config-data\") pod \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388590 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6pd\" (UniqueName: \"kubernetes.io/projected/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-kube-api-access-lx6pd\") pod \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388775 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-combined-ca-bundle\") pod \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.388967 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-scripts\") pod \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\" (UID: \"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01\") " Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.390151 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.403248 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-scripts" (OuterVolumeSpecName: "scripts") pod "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" (UID: "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.403480 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" (UID: "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.405967 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-kube-api-access-lx6pd" (OuterVolumeSpecName: "kube-api-access-lx6pd") pod "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" (UID: "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01"). InnerVolumeSpecName "kube-api-access-lx6pd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.462620 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" (UID: "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.473341 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-config-data" (OuterVolumeSpecName: "config-data") pod "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" (UID: "71a5e8b8-15b7-4f2b-a60f-3c28341cbe01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.492733 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.492784 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.492799 5200 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.492810 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.492823 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lx6pd\" (UniqueName: \"kubernetes.io/projected/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01-kube-api-access-lx6pd\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:35 crc kubenswrapper[5200]: W1126 13:51:35.730702 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a85e23d_55dd_4d11_b304_aeb59804f47d.slice/crio-837dc8f816442390ac263aa5e594d57578d80c828712f35b3d6e908392d57d69 WatchSource:0}: Error finding container 837dc8f816442390ac263aa5e594d57578d80c828712f35b3d6e908392d57d69: Status 404 returned error can't find the container with id 837dc8f816442390ac263aa5e594d57578d80c828712f35b3d6e908392d57d69 Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.782096 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7544499f5f-8pt52"] Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.801157 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55d547fcd6-blkmf"] Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.804611 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29258dfe-0805-481f-ad5f-a133bd88517e","Type":"ContainerStarted","Data":"083601691bb0bde35bfac41a3c731f77bedc5b235a717cdf92b5a317fed27f1a"} Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.807704 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7544499f5f-8pt52" event={"ID":"0a85e23d-55dd-4d11-b304-aeb59804f47d","Type":"ContainerStarted","Data":"837dc8f816442390ac263aa5e594d57578d80c828712f35b3d6e908392d57d69"} Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.818767 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"baadd4af-f958-45eb-ae2f-eb5ecead58a7","Type":"ContainerStarted","Data":"57bd9432cfc7e5ffb3b2f1f725e50337c201944249329a6b2edaf1076f1810ab"} Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.836285 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g4hpw" event={"ID":"71a5e8b8-15b7-4f2b-a60f-3c28341cbe01","Type":"ContainerDied","Data":"4c05765318b302e9083463f6cc5c9c8f10879179cd6a45826b61ade40cad74b0"} Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.836346 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c05765318b302e9083463f6cc5c9c8f10879179cd6a45826b61ade40cad74b0" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.836455 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g4hpw" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.858973 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.858938791 podStartE2EDuration="6.858938791s" podCreationTimestamp="2025-11-26 13:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:35.851465733 +0000 UTC m=+1264.530667551" watchObservedRunningTime="2025-11-26 13:51:35.858938791 +0000 UTC m=+1264.538140629" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.873733 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" event={"ID":"5dd57521-dfcc-4303-9f37-3485c7407851","Type":"ContainerStarted","Data":"037187032bb8ed0fb52dcce979cf8569f7b3faa56b155c563a3e7543d2392b63"} Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.907091 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.918381 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" containerName="cinder-db-sync" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.918432 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" containerName="cinder-db-sync" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.918681 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" containerName="cinder-db-sync" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.940767 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.940928 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.944718 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scheduler-config-data\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.953615 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.953904 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-xx5md\"" Nov 26 13:51:35 crc kubenswrapper[5200]: I1126 13:51:35.960335 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.001869 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.001933 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz95j\" (UniqueName: \"kubernetes.io/projected/51b758c5-ab01-4e10-98dd-b668f6f64d4a-kube-api-access-wz95j\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.001972 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-scripts\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.002042 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51b758c5-ab01-4e10-98dd-b668f6f64d4a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.002186 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.002208 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.024027 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74654ff48c-9vkqv"] Nov 26 13:51:36 crc kubenswrapper[5200]: W1126 13:51:36.034666 5200 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11f3ae28_91c1_48d2_a885_f08d58e3a272.slice/crio-369a3823832e31ed171c207e313b9ca5a180c8c28f22b0c4f10b0e70f3ff41e5 WatchSource:0}: Error finding container 369a3823832e31ed171c207e313b9ca5a180c8c28f22b0c4f10b0e70f3ff41e5: Status 404 returned error can't find the container with id 369a3823832e31ed171c207e313b9ca5a180c8c28f22b0c4f10b0e70f3ff41e5 Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.089019 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74654ff48c-9vkqv"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.104694 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51b758c5-ab01-4e10-98dd-b668f6f64d4a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.104777 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.104798 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.104850 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.104890 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz95j\" (UniqueName: \"kubernetes.io/projected/51b758c5-ab01-4e10-98dd-b668f6f64d4a-kube-api-access-wz95j\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.104927 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-scripts\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.109651 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51b758c5-ab01-4e10-98dd-b668f6f64d4a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.110648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-scripts\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 
13:51:36.125965 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.133322 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.137551 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz95j\" (UniqueName: \"kubernetes.io/projected/51b758c5-ab01-4e10-98dd-b668f6f64d4a-kube-api-access-wz95j\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.156293 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.172172 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d749f55c5-g5s9f"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.229254 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.239177 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d749f55c5-g5s9f"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.291594 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.297420 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.304672 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.310247 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6m4\" (UniqueName: \"kubernetes.io/projected/6906c5bb-d665-40c3-a47b-cb8d169764c3-kube-api-access-rs6m4\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.310290 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-swift-storage-0\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.310331 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-svc\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.310367 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-sb\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.310410 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-config\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.310455 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-nb\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.319103 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-api-config-data\"" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.351289 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.419895 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljw4\" (UniqueName: \"kubernetes.io/projected/f219e39d-b7f5-42db-800b-84a40c27eb82-kube-api-access-nljw4\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.420589 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-svc\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " 
pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.420714 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data-custom\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.420805 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-sb\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.421243 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-config\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.421638 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f219e39d-b7f5-42db-800b-84a40c27eb82-logs\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.421728 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-scripts\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.421807 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.421849 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-nb\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.422530 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-svc\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.422816 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f219e39d-b7f5-42db-800b-84a40c27eb82-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.422947 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rs6m4\" (UniqueName: \"kubernetes.io/projected/6906c5bb-d665-40c3-a47b-cb8d169764c3-kube-api-access-rs6m4\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.423584 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-swift-storage-0\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.433781 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.433634 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-swift-storage-0\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.435464 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-sb\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.436178 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-config\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.436189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-nb\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.439226 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcbdf5458-6txbv"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.448208 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6m4\" (UniqueName: \"kubernetes.io/projected/6906c5bb-d665-40c3-a47b-cb8d169764c3-kube-api-access-rs6m4\") pod \"dnsmasq-dns-6d749f55c5-g5s9f\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.542908 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f219e39d-b7f5-42db-800b-84a40c27eb82-logs\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.543322 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-scripts\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.543346 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.543427 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f219e39d-b7f5-42db-800b-84a40c27eb82-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.543475 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.543498 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nljw4\" (UniqueName: \"kubernetes.io/projected/f219e39d-b7f5-42db-800b-84a40c27eb82-kube-api-access-nljw4\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.543540 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data-custom\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.544506 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f219e39d-b7f5-42db-800b-84a40c27eb82-logs\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.544622 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f219e39d-b7f5-42db-800b-84a40c27eb82-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.551424 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.551669 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.554494 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data-custom\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.554569 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-scripts\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.566097 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljw4\" (UniqueName: \"kubernetes.io/projected/f219e39d-b7f5-42db-800b-84a40c27eb82-kube-api-access-nljw4\") pod \"cinder-api-0\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.575803 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vk528" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.647524 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt788\" (UniqueName: \"kubernetes.io/projected/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-kube-api-access-rt788\") pod \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.648070 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-config\") pod \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.648241 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-combined-ca-bundle\") pod \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\" (UID: \"4ce2de04-4f6b-49fa-9946-5ef9663e95e8\") " Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.648880 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.665828 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-kube-api-access-rt788" (OuterVolumeSpecName: "kube-api-access-rt788") pod "4ce2de04-4f6b-49fa-9946-5ef9663e95e8" (UID: "4ce2de04-4f6b-49fa-9946-5ef9663e95e8"). InnerVolumeSpecName "kube-api-access-rt788". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.666424 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.691835 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce2de04-4f6b-49fa-9946-5ef9663e95e8" (UID: "4ce2de04-4f6b-49fa-9946-5ef9663e95e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.715943 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-config" (OuterVolumeSpecName: "config") pod "4ce2de04-4f6b-49fa-9946-5ef9663e95e8" (UID: "4ce2de04-4f6b-49fa-9946-5ef9663e95e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.750668 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.753039 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rt788\" (UniqueName: \"kubernetes.io/projected/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-kube-api-access-rt788\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.753149 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ce2de04-4f6b-49fa-9946-5ef9663e95e8-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.820212 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.918671 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcbdf5458-6txbv" event={"ID":"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392","Type":"ContainerStarted","Data":"623320551b9ec677d7c7026f6bc7b35f9e5ebb9cfb04aaf275e60244c0b0ecd1"} Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.919247 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcbdf5458-6txbv" event={"ID":"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392","Type":"ContainerStarted","Data":"e4ee3eb159d2775c7e423a56ef95b5613486eb92814f5f89bdb132cc52a411d5"} Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.940778 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.945299 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29258dfe-0805-481f-ad5f-a133bd88517e","Type":"ContainerStarted","Data":"d07b1c117ffd009ab064a77d5433bc202a99c4a7faeb7d1eb6420d58dd9b017f"} Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.964971 5200 generic.go:358] "Generic (PLEG): container finished" podID="11f3ae28-91c1-48d2-a885-f08d58e3a272" containerID="6d3d4b43b75f7bb17052000ff85de1f6ea2ef261540c85c633cead1061f1e530" exitCode=0 Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.965079 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" event={"ID":"11f3ae28-91c1-48d2-a885-f08d58e3a272","Type":"ContainerDied","Data":"6d3d4b43b75f7bb17052000ff85de1f6ea2ef261540c85c633cead1061f1e530"} Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.965207 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" event={"ID":"11f3ae28-91c1-48d2-a885-f08d58e3a272","Type":"ContainerStarted","Data":"369a3823832e31ed171c207e313b9ca5a180c8c28f22b0c4f10b0e70f3ff41e5"} Nov 26 13:51:36 crc kubenswrapper[5200]: I1126 13:51:36.985484 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vk528" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.055122 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.055091577 podStartE2EDuration="6.055091577s" podCreationTimestamp="2025-11-26 13:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:36.979260541 +0000 UTC m=+1265.658462369" watchObservedRunningTime="2025-11-26 13:51:37.055091577 +0000 UTC m=+1265.734293395" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.118738 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vk528" event={"ID":"4ce2de04-4f6b-49fa-9946-5ef9663e95e8","Type":"ContainerDied","Data":"d687e5b744ef8cb6797d83388d34a99606f25e5d74a192459b1352276f2fefcb"} Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.118804 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d687e5b744ef8cb6797d83388d34a99606f25e5d74a192459b1352276f2fefcb" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.129676 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d749f55c5-g5s9f"] Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.146238 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54db795db7-2zrdk"] Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.148071 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ce2de04-4f6b-49fa-9946-5ef9663e95e8" containerName="neutron-db-sync" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.148100 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce2de04-4f6b-49fa-9946-5ef9663e95e8" containerName="neutron-db-sync" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.148358 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ce2de04-4f6b-49fa-9946-5ef9663e95e8" containerName="neutron-db-sync" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.155551 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.266355 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-swift-storage-0\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.266929 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-nb\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.266969 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-sb\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.267020 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-config\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.267068 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-svc\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.267116 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j6qf\" (UniqueName: \"kubernetes.io/projected/13b63abc-53cc-433b-8635-2cc87fed8052-kube-api-access-6j6qf\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.291297 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-5777f78f56-8db7l"] Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.326885 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54db795db7-2zrdk"] Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.326941 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5777f78f56-8db7l"] Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.327088 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.349485 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-neutron-dockercfg-7kxzd\"" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.350127 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-neutron-ovndbs\"" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.350484 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-httpd-config\"" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.350854 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-config\"" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.373017 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-swift-storage-0\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.373061 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-nb\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.373102 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-sb\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.373155 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-config\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.373267 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-svc\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.373306 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j6qf\" (UniqueName: \"kubernetes.io/projected/13b63abc-53cc-433b-8635-2cc87fed8052-kube-api-access-6j6qf\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.384696 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-swift-storage-0\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: 
I1126 13:51:37.389407 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-nb\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.393229 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-sb\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.393433 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-svc\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.395796 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-config\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.431921 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j6qf\" (UniqueName: \"kubernetes.io/projected/13b63abc-53cc-433b-8635-2cc87fed8052-kube-api-access-6j6qf\") pod \"dnsmasq-dns-54db795db7-2zrdk\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.475446 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-config\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.475939 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-combined-ca-bundle\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.475968 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcv5\" (UniqueName: \"kubernetes.io/projected/0effc02a-2b10-417e-b686-635f9578dfb7-kube-api-access-2kcv5\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.476025 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-httpd-config\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.476058 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-ovndb-tls-certs\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.578159 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-config\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.578252 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-combined-ca-bundle\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.578274 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcv5\" (UniqueName: \"kubernetes.io/projected/0effc02a-2b10-417e-b686-635f9578dfb7-kube-api-access-2kcv5\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.578307 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-httpd-config\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.578332 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-ovndb-tls-certs\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.586556 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-combined-ca-bundle\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.589156 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-httpd-config\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.589825 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-config\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.617439 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-ovndb-tls-certs\") pod 
\"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.619175 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcv5\" (UniqueName: \"kubernetes.io/projected/0effc02a-2b10-417e-b686-635f9578dfb7-kube-api-access-2kcv5\") pod \"neutron-5777f78f56-8db7l\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.676343 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.715453 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.767603 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d749f55c5-g5s9f"] Nov 26 13:51:37 crc kubenswrapper[5200]: W1126 13:51:37.783958 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6906c5bb_d665_40c3_a47b_cb8d169764c3.slice/crio-8267e7f0deb2cbbba23901a33e7763dee6c7ef7e49c05738e0d8eb478e85f104 WatchSource:0}: Error finding container 8267e7f0deb2cbbba23901a33e7763dee6c7ef7e49c05738e0d8eb478e85f104: Status 404 returned error can't find the container with id 8267e7f0deb2cbbba23901a33e7763dee6c7ef7e49c05738e0d8eb478e85f104 Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.804349 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.897859 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjgcm\" (UniqueName: \"kubernetes.io/projected/11f3ae28-91c1-48d2-a885-f08d58e3a272-kube-api-access-zjgcm\") pod \"11f3ae28-91c1-48d2-a885-f08d58e3a272\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.898065 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-nb\") pod \"11f3ae28-91c1-48d2-a885-f08d58e3a272\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.898287 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-svc\") pod \"11f3ae28-91c1-48d2-a885-f08d58e3a272\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.898357 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-sb\") pod \"11f3ae28-91c1-48d2-a885-f08d58e3a272\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.898462 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-swift-storage-0\") pod \"11f3ae28-91c1-48d2-a885-f08d58e3a272\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " Nov 26 
13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.898575 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-config\") pod \"11f3ae28-91c1-48d2-a885-f08d58e3a272\" (UID: \"11f3ae28-91c1-48d2-a885-f08d58e3a272\") " Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.940977 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f3ae28-91c1-48d2-a885-f08d58e3a272-kube-api-access-zjgcm" (OuterVolumeSpecName: "kube-api-access-zjgcm") pod "11f3ae28-91c1-48d2-a885-f08d58e3a272" (UID: "11f3ae28-91c1-48d2-a885-f08d58e3a272"). InnerVolumeSpecName "kube-api-access-zjgcm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.971994 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11f3ae28-91c1-48d2-a885-f08d58e3a272" (UID: "11f3ae28-91c1-48d2-a885-f08d58e3a272"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.993110 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-config" (OuterVolumeSpecName: "config") pod "11f3ae28-91c1-48d2-a885-f08d58e3a272" (UID: "11f3ae28-91c1-48d2-a885-f08d58e3a272"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:37 crc kubenswrapper[5200]: I1126 13:51:37.998562 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11f3ae28-91c1-48d2-a885-f08d58e3a272" (UID: "11f3ae28-91c1-48d2-a885-f08d58e3a272"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.000104 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "11f3ae28-91c1-48d2-a885-f08d58e3a272" (UID: "11f3ae28-91c1-48d2-a885-f08d58e3a272"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.003818 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.005967 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.006073 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjgcm\" (UniqueName: \"kubernetes.io/projected/11f3ae28-91c1-48d2-a885-f08d58e3a272-kube-api-access-zjgcm\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.006146 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.006207 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.036272 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11f3ae28-91c1-48d2-a885-f08d58e3a272" (UID: "11f3ae28-91c1-48d2-a885-f08d58e3a272"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.038319 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcbdf5458-6txbv" event={"ID":"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392","Type":"ContainerStarted","Data":"a1dea5d541937c30fc574b8d069744ef95d747ea3e03dc28c29d3c404bef2735"} Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.040298 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.040333 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.052912 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" event={"ID":"6906c5bb-d665-40c3-a47b-cb8d169764c3","Type":"ContainerStarted","Data":"8267e7f0deb2cbbba23901a33e7763dee6c7ef7e49c05738e0d8eb478e85f104"} Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.077474 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.089102 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" event={"ID":"11f3ae28-91c1-48d2-a885-f08d58e3a272","Type":"ContainerDied","Data":"369a3823832e31ed171c207e313b9ca5a180c8c28f22b0c4f10b0e70f3ff41e5"} Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.089174 5200 scope.go:117] "RemoveContainer" containerID="6d3d4b43b75f7bb17052000ff85de1f6ea2ef261540c85c633cead1061f1e530" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.090249 5200 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74654ff48c-9vkqv" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.107505 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dcbdf5458-6txbv" podStartSLOduration=4.10748179 podStartE2EDuration="4.10748179s" podCreationTimestamp="2025-11-26 13:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:38.079386856 +0000 UTC m=+1266.758588674" watchObservedRunningTime="2025-11-26 13:51:38.10748179 +0000 UTC m=+1266.786683608" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.126949 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f3ae28-91c1-48d2-a885-f08d58e3a272-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.128768 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51b758c5-ab01-4e10-98dd-b668f6f64d4a","Type":"ContainerStarted","Data":"caae443626d95679fc346f165c28ec27aa9bc3382717f1b50e64877b253fc114"} Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.341015 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74654ff48c-9vkqv"] Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.353878 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74654ff48c-9vkqv"] Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.414189 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54db795db7-2zrdk"] Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.636598 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5777f78f56-8db7l"] Nov 26 13:51:38 crc kubenswrapper[5200]: I1126 13:51:38.988637 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f3ae28-91c1-48d2-a885-f08d58e3a272" path="/var/lib/kubelet/pods/11f3ae28-91c1-48d2-a885-f08d58e3a272/volumes" Nov 26 13:51:39 crc kubenswrapper[5200]: W1126 13:51:39.092325 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0effc02a_2b10_417e_b686_635f9578dfb7.slice/crio-0b9815d881e963dacf0a891cbc1d8788bd297d803eafecb9ab46f1ed9dc2c3ac WatchSource:0}: Error finding container 0b9815d881e963dacf0a891cbc1d8788bd297d803eafecb9ab46f1ed9dc2c3ac: Status 404 returned error can't find the container with id 0b9815d881e963dacf0a891cbc1d8788bd297d803eafecb9ab46f1ed9dc2c3ac Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.158152 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f219e39d-b7f5-42db-800b-84a40c27eb82","Type":"ContainerStarted","Data":"64a8811c7077799837f2727c1c9aaccd208c8542ac3791bce01ee38bfba291d6"} Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.163283 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" event={"ID":"13b63abc-53cc-433b-8635-2cc87fed8052","Type":"ContainerStarted","Data":"43051443c94310af51f1987fda355e6d1a47e1f7d30e36e600d1c14cfb629cc2"} Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.164916 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5777f78f56-8db7l" 
event={"ID":"0effc02a-2b10-417e-b686-635f9578dfb7","Type":"ContainerStarted","Data":"0b9815d881e963dacf0a891cbc1d8788bd297d803eafecb9ab46f1ed9dc2c3ac"} Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.166258 5200 generic.go:358] "Generic (PLEG): container finished" podID="6906c5bb-d665-40c3-a47b-cb8d169764c3" containerID="0220a4b14e28a7a0c39cdeda222957e8b9feb37dfed26d1775d9e8e409891259" exitCode=0 Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.166328 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" event={"ID":"6906c5bb-d665-40c3-a47b-cb8d169764c3","Type":"ContainerDied","Data":"0220a4b14e28a7a0c39cdeda222957e8b9feb37dfed26d1775d9e8e409891259"} Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.863545 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.871166 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs6m4\" (UniqueName: \"kubernetes.io/projected/6906c5bb-d665-40c3-a47b-cb8d169764c3-kube-api-access-rs6m4\") pod \"6906c5bb-d665-40c3-a47b-cb8d169764c3\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.871238 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-nb\") pod \"6906c5bb-d665-40c3-a47b-cb8d169764c3\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.871285 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-swift-storage-0\") pod \"6906c5bb-d665-40c3-a47b-cb8d169764c3\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.871525 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-svc\") pod \"6906c5bb-d665-40c3-a47b-cb8d169764c3\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.871920 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-config\") pod \"6906c5bb-d665-40c3-a47b-cb8d169764c3\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.871957 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-sb\") pod \"6906c5bb-d665-40c3-a47b-cb8d169764c3\" (UID: \"6906c5bb-d665-40c3-a47b-cb8d169764c3\") " Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.882298 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6906c5bb-d665-40c3-a47b-cb8d169764c3-kube-api-access-rs6m4" (OuterVolumeSpecName: "kube-api-access-rs6m4") pod "6906c5bb-d665-40c3-a47b-cb8d169764c3" (UID: "6906c5bb-d665-40c3-a47b-cb8d169764c3"). InnerVolumeSpecName "kube-api-access-rs6m4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.932078 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6906c5bb-d665-40c3-a47b-cb8d169764c3" (UID: "6906c5bb-d665-40c3-a47b-cb8d169764c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.953983 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.974847 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rs6m4\" (UniqueName: \"kubernetes.io/projected/6906c5bb-d665-40c3-a47b-cb8d169764c3-kube-api-access-rs6m4\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:39 crc kubenswrapper[5200]: I1126 13:51:39.974910 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.117202 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6906c5bb-d665-40c3-a47b-cb8d169764c3" (UID: "6906c5bb-d665-40c3-a47b-cb8d169764c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.117997 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6906c5bb-d665-40c3-a47b-cb8d169764c3" (UID: "6906c5bb-d665-40c3-a47b-cb8d169764c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.129086 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6906c5bb-d665-40c3-a47b-cb8d169764c3" (UID: "6906c5bb-d665-40c3-a47b-cb8d169764c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.147003 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-config" (OuterVolumeSpecName: "config") pod "6906c5bb-d665-40c3-a47b-cb8d169764c3" (UID: "6906c5bb-d665-40c3-a47b-cb8d169764c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.185139 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.185178 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.185191 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.185202 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6906c5bb-d665-40c3-a47b-cb8d169764c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.201439 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" event={"ID":"6906c5bb-d665-40c3-a47b-cb8d169764c3","Type":"ContainerDied","Data":"8267e7f0deb2cbbba23901a33e7763dee6c7ef7e49c05738e0d8eb478e85f104"} Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.201500 5200 scope.go:117] "RemoveContainer" containerID="0220a4b14e28a7a0c39cdeda222957e8b9feb37dfed26d1775d9e8e409891259" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.201640 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d749f55c5-g5s9f" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.208631 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f219e39d-b7f5-42db-800b-84a40c27eb82","Type":"ContainerStarted","Data":"75082115eef574726d2a40ff772f41a12d7991ab0b3b9db9bee64d3ebc8f5e91"} Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.317385 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.319012 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.384707 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.386034 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.499309 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d749f55c5-g5s9f"] Nov 26 13:51:40 crc kubenswrapper[5200]: I1126 13:51:40.507123 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d749f55c5-g5s9f"] Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.014249 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6906c5bb-d665-40c3-a47b-cb8d169764c3" path="/var/lib/kubelet/pods/6906c5bb-d665-40c3-a47b-cb8d169764c3/volumes" Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.242890 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7544499f5f-8pt52" event={"ID":"0a85e23d-55dd-4d11-b304-aeb59804f47d","Type":"ContainerStarted","Data":"da8e534c170ae59a27fe6be7bcd6d2dc6851930207b80fde5ade7f1d41b386bc"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.242960 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7544499f5f-8pt52" event={"ID":"0a85e23d-55dd-4d11-b304-aeb59804f47d","Type":"ContainerStarted","Data":"fea23ba732bf069b37174feb4444f8fe8af7b5feb6c583a3ca3c7cf599079d2c"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.245758 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51b758c5-ab01-4e10-98dd-b668f6f64d4a","Type":"ContainerStarted","Data":"1649c4020dafd54120f86d8699e17d9c5746dd0aa90fbea37a9e7663c72fed12"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.250441 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" event={"ID":"5dd57521-dfcc-4303-9f37-3485c7407851","Type":"ContainerStarted","Data":"6c6a86eadcfcca89403d03d1232bb2037cc1a1a3bd1de3d050a23ad833ce0858"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.250481 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" event={"ID":"5dd57521-dfcc-4303-9f37-3485c7407851","Type":"ContainerStarted","Data":"f9fa9adbbf6d1354d89225638d8e5a8383e2d9b6d0d928de2d7dc4db987750c0"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.257281 5200 generic.go:358] "Generic (PLEG): container finished" podID="13b63abc-53cc-433b-8635-2cc87fed8052" containerID="337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650" exitCode=0 Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.257601 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" event={"ID":"13b63abc-53cc-433b-8635-2cc87fed8052","Type":"ContainerDied","Data":"337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.267587 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5777f78f56-8db7l" event={"ID":"0effc02a-2b10-417e-b686-635f9578dfb7","Type":"ContainerStarted","Data":"cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.267651 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5777f78f56-8db7l" event={"ID":"0effc02a-2b10-417e-b686-635f9578dfb7","Type":"ContainerStarted","Data":"2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a"} Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.268603 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.268664 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.269291 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7544499f5f-8pt52" podStartSLOduration=3.027191539 podStartE2EDuration="7.269278642s" podCreationTimestamp="2025-11-26 13:51:34 +0000 UTC" firstStartedPulling="2025-11-26 13:51:35.738057672 +0000 UTC m=+1264.417259490" lastFinishedPulling="2025-11-26 13:51:39.980144775 +0000 UTC m=+1268.659346593" observedRunningTime="2025-11-26 
13:51:41.26205203 +0000 UTC m=+1269.941253858" watchObservedRunningTime="2025-11-26 13:51:41.269278642 +0000 UTC m=+1269.948480470" Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.461727 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" podStartSLOduration=3.277973463 podStartE2EDuration="7.461692681s" podCreationTimestamp="2025-11-26 13:51:34 +0000 UTC" firstStartedPulling="2025-11-26 13:51:35.748588981 +0000 UTC m=+1264.427790799" lastFinishedPulling="2025-11-26 13:51:39.932308199 +0000 UTC m=+1268.611510017" observedRunningTime="2025-11-26 13:51:41.320718701 +0000 UTC m=+1269.999920519" watchObservedRunningTime="2025-11-26 13:51:41.461692681 +0000 UTC m=+1270.140894499" Nov 26 13:51:41 crc kubenswrapper[5200]: I1126 13:51:41.465379 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5777f78f56-8db7l" podStartSLOduration=4.465368918 podStartE2EDuration="4.465368918s" podCreationTimestamp="2025-11-26 13:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:41.362722593 +0000 UTC m=+1270.041924401" watchObservedRunningTime="2025-11-26 13:51:41.465368918 +0000 UTC m=+1270.144570736" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.020144 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.020518 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.093527 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.127182 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.303324 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51b758c5-ab01-4e10-98dd-b668f6f64d4a","Type":"ContainerStarted","Data":"117490d78a8f8035872de33fdb583986c7d358214bd8254902762fe0ca83dc5e"} Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.317451 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f219e39d-b7f5-42db-800b-84a40c27eb82","Type":"ContainerStarted","Data":"2c5b9c5d8acaa325423bad47c1a2cc971b04f556bed04e1e17377dbfb3addfbf"} Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.317702 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api-log" containerID="cri-o://75082115eef574726d2a40ff772f41a12d7991ab0b3b9db9bee64d3ebc8f5e91" gracePeriod=30 Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.318035 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api" containerID="cri-o://2c5b9c5d8acaa325423bad47c1a2cc971b04f556bed04e1e17377dbfb3addfbf" gracePeriod=30 Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.318044 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/cinder-api-0" Nov 26 13:51:42 
crc kubenswrapper[5200]: I1126 13:51:42.346766 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" event={"ID":"13b63abc-53cc-433b-8635-2cc87fed8052","Type":"ContainerStarted","Data":"9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5"} Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.346919 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.347081 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.347514 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.349163 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.240007466 podStartE2EDuration="7.34913796s" podCreationTimestamp="2025-11-26 13:51:35 +0000 UTC" firstStartedPulling="2025-11-26 13:51:37.383060924 +0000 UTC m=+1266.062262742" lastFinishedPulling="2025-11-26 13:51:38.492191418 +0000 UTC m=+1267.171393236" observedRunningTime="2025-11-26 13:51:42.329672705 +0000 UTC m=+1271.008874533" watchObservedRunningTime="2025-11-26 13:51:42.34913796 +0000 UTC m=+1271.028339778" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.353758 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.385507 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.385484802 podStartE2EDuration="6.385484802s" podCreationTimestamp="2025-11-26 13:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:42.366388067 +0000 UTC m=+1271.045589885" watchObservedRunningTime="2025-11-26 13:51:42.385484802 +0000 UTC m=+1271.064686620" Nov 26 13:51:42 crc kubenswrapper[5200]: I1126 13:51:42.422245 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" podStartSLOduration=5.422214324 podStartE2EDuration="5.422214324s" podCreationTimestamp="2025-11-26 13:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:42.390159716 +0000 UTC m=+1271.069361544" watchObservedRunningTime="2025-11-26 13:51:42.422214324 +0000 UTC m=+1271.101416142" Nov 26 13:51:43 crc kubenswrapper[5200]: I1126 13:51:43.397556 5200 generic.go:358] "Generic (PLEG): container finished" podID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerID="2c5b9c5d8acaa325423bad47c1a2cc971b04f556bed04e1e17377dbfb3addfbf" exitCode=0 Nov 26 13:51:43 crc kubenswrapper[5200]: I1126 13:51:43.397998 5200 generic.go:358] "Generic (PLEG): container finished" podID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerID="75082115eef574726d2a40ff772f41a12d7991ab0b3b9db9bee64d3ebc8f5e91" exitCode=143 Nov 26 13:51:43 crc kubenswrapper[5200]: I1126 13:51:43.399573 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"f219e39d-b7f5-42db-800b-84a40c27eb82","Type":"ContainerDied","Data":"2c5b9c5d8acaa325423bad47c1a2cc971b04f556bed04e1e17377dbfb3addfbf"} Nov 26 13:51:43 crc kubenswrapper[5200]: I1126 13:51:43.399675 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f219e39d-b7f5-42db-800b-84a40c27eb82","Type":"ContainerDied","Data":"75082115eef574726d2a40ff772f41a12d7991ab0b3b9db9bee64d3ebc8f5e91"} Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.083532 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.084123 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 13:51:44 crc kubenswrapper[5200]: E1126 13:51:44.099904 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1619008 actualBytes=10240 Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.219757 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-747455c75f-8zrqn"] Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.221159 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11f3ae28-91c1-48d2-a885-f08d58e3a272" containerName="init" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.221178 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f3ae28-91c1-48d2-a885-f08d58e3a272" containerName="init" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.221244 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6906c5bb-d665-40c3-a47b-cb8d169764c3" containerName="init" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.221250 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6906c5bb-d665-40c3-a47b-cb8d169764c3" containerName="init" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.221414 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="11f3ae28-91c1-48d2-a885-f08d58e3a272" containerName="init" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.221436 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6906c5bb-d665-40c3-a47b-cb8d169764c3" containerName="init" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.243717 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.243887 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.250116 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-747455c75f-8zrqn"] Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.254942 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-neutron-public-svc\"" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.255835 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-neutron-internal-svc\"" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.333057 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-public-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.333134 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-httpd-config\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.333272 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-ovndb-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.333636 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-combined-ca-bundle\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.333827 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-internal-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.334004 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6m5\" (UniqueName: \"kubernetes.io/projected/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-kube-api-access-8x6m5\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.334047 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-config\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.413304 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 13:51:44 
crc kubenswrapper[5200]: I1126 13:51:44.413817 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436117 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-combined-ca-bundle\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436195 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-internal-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436253 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6m5\" (UniqueName: \"kubernetes.io/projected/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-kube-api-access-8x6m5\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436274 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-config\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436330 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-public-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436351 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-httpd-config\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.436385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-ovndb-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.447135 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-httpd-config\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.458324 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-ovndb-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc 
kubenswrapper[5200]: I1126 13:51:44.458769 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-internal-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.459315 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-public-tls-certs\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.469039 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-combined-ca-bundle\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.469747 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-config\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.479189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6m5\" (UniqueName: \"kubernetes.io/projected/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-kube-api-access-8x6m5\") pod \"neutron-747455c75f-8zrqn\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.584909 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:44 crc kubenswrapper[5200]: I1126 13:51:44.991443 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.257987 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.604059 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d9fcdcd8b-tl47q"] Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.614475 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.630569 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-barbican-public-svc\"" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.630812 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-barbican-internal-svc\"" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.643519 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d9fcdcd8b-tl47q"] Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681189 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjxt5\" (UniqueName: \"kubernetes.io/projected/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-kube-api-access-vjxt5\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681280 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-combined-ca-bundle\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681317 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-logs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681344 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681365 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-internal-tls-certs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681385 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data-custom\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.681891 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-public-tls-certs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789140 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-public-tls-certs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789309 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjxt5\" (UniqueName: \"kubernetes.io/projected/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-kube-api-access-vjxt5\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789367 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-combined-ca-bundle\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789419 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-logs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789450 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789469 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-internal-tls-certs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.789496 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data-custom\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.793813 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-logs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.798446 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-public-tls-certs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.798498 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-combined-ca-bundle\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.801726 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.802283 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-internal-tls-certs\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.804214 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data-custom\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.815344 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjxt5\" (UniqueName: \"kubernetes.io/projected/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-kube-api-access-vjxt5\") pod \"barbican-api-5d9fcdcd8b-tl47q\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:45 crc kubenswrapper[5200]: I1126 13:51:45.987425 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:46 crc kubenswrapper[5200]: I1126 13:51:46.292286 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 13:51:46 crc kubenswrapper[5200]: I1126 13:51:46.601898 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 13:51:46 crc kubenswrapper[5200]: I1126 13:51:46.655564 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:47 crc kubenswrapper[5200]: I1126 13:51:47.165468 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:47 crc kubenswrapper[5200]: I1126 13:51:47.263772 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:51:47 crc kubenswrapper[5200]: I1126 13:51:47.475220 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="cinder-scheduler" containerID="cri-o://1649c4020dafd54120f86d8699e17d9c5746dd0aa90fbea37a9e7663c72fed12" gracePeriod=30 Nov 26 13:51:47 crc kubenswrapper[5200]: I1126 13:51:47.475744 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="probe" containerID="cri-o://117490d78a8f8035872de33fdb583986c7d358214bd8254902762fe0ca83dc5e" gracePeriod=30 Nov 26 13:51:48 crc kubenswrapper[5200]: I1126 13:51:48.403030 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:51:48 crc kubenswrapper[5200]: I1126 13:51:48.527543 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f494c55f-m592t"] Nov 26 13:51:48 crc kubenswrapper[5200]: I1126 13:51:48.528102 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79f494c55f-m592t" podUID="1247eec6-f811-4028-9e60-ee2e95526022" containerName="dnsmasq-dns" containerID="cri-o://def7f66910a2c22d2dc2809f86bc8ccad8aa9f0be671fc91e2d65c9c463fa23b" gracePeriod=10 Nov 26 13:51:48 crc kubenswrapper[5200]: I1126 13:51:48.552470 5200 generic.go:358] "Generic (PLEG): container finished" podID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerID="117490d78a8f8035872de33fdb583986c7d358214bd8254902762fe0ca83dc5e" exitCode=0 Nov 26 13:51:48 crc kubenswrapper[5200]: I1126 13:51:48.552826 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51b758c5-ab01-4e10-98dd-b668f6f64d4a","Type":"ContainerDied","Data":"117490d78a8f8035872de33fdb583986c7d358214bd8254902762fe0ca83dc5e"} Nov 26 13:51:49 crc kubenswrapper[5200]: I1126 13:51:49.568008 5200 generic.go:358] "Generic (PLEG): container finished" podID="1247eec6-f811-4028-9e60-ee2e95526022" containerID="def7f66910a2c22d2dc2809f86bc8ccad8aa9f0be671fc91e2d65c9c463fa23b" exitCode=0 Nov 26 13:51:49 crc kubenswrapper[5200]: I1126 13:51:49.568110 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f494c55f-m592t" event={"ID":"1247eec6-f811-4028-9e60-ee2e95526022","Type":"ContainerDied","Data":"def7f66910a2c22d2dc2809f86bc8ccad8aa9f0be671fc91e2d65c9c463fa23b"} Nov 26 13:51:49 crc kubenswrapper[5200]: I1126 13:51:49.573170 5200 generic.go:358] 
"Generic (PLEG): container finished" podID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerID="1649c4020dafd54120f86d8699e17d9c5746dd0aa90fbea37a9e7663c72fed12" exitCode=0 Nov 26 13:51:49 crc kubenswrapper[5200]: I1126 13:51:49.573246 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51b758c5-ab01-4e10-98dd-b668f6f64d4a","Type":"ContainerDied","Data":"1649c4020dafd54120f86d8699e17d9c5746dd0aa90fbea37a9e7663c72fed12"} Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.410232 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.436100 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.496728 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-combined-ca-bundle\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.497188 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data-custom\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.497209 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-scripts\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.497294 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.497514 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f219e39d-b7f5-42db-800b-84a40c27eb82-logs\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.497805 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f219e39d-b7f5-42db-800b-84a40c27eb82-etc-machine-id\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.497931 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljw4\" (UniqueName: \"kubernetes.io/projected/f219e39d-b7f5-42db-800b-84a40c27eb82-kube-api-access-nljw4\") pod \"f219e39d-b7f5-42db-800b-84a40c27eb82\" (UID: \"f219e39d-b7f5-42db-800b-84a40c27eb82\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.500454 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f219e39d-b7f5-42db-800b-84a40c27eb82-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.500930 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f219e39d-b7f5-42db-800b-84a40c27eb82-logs" (OuterVolumeSpecName: "logs") pod "f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.513137 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f219e39d-b7f5-42db-800b-84a40c27eb82-kube-api-access-nljw4" (OuterVolumeSpecName: "kube-api-access-nljw4") pod "f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "kube-api-access-nljw4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.520928 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.521585 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-scripts" (OuterVolumeSpecName: "scripts") pod "f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.557208 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.606205 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data-custom\") pod \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.606510 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-scripts\") pod \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.606794 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data\") pod \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.606922 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51b758c5-ab01-4e10-98dd-b668f6f64d4a-etc-machine-id\") pod \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.606970 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-combined-ca-bundle\") pod \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.607033 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz95j\" (UniqueName: \"kubernetes.io/projected/51b758c5-ab01-4e10-98dd-b668f6f64d4a-kube-api-access-wz95j\") pod \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\" (UID: \"51b758c5-ab01-4e10-98dd-b668f6f64d4a\") " Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.608789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51b758c5-ab01-4e10-98dd-b668f6f64d4a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51b758c5-ab01-4e10-98dd-b668f6f64d4a" (UID: "51b758c5-ab01-4e10-98dd-b668f6f64d4a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.613907 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "51b758c5-ab01-4e10-98dd-b668f6f64d4a" (UID: "51b758c5-ab01-4e10-98dd-b668f6f64d4a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614794 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614811 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614825 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f219e39d-b7f5-42db-800b-84a40c27eb82-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614835 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51b758c5-ab01-4e10-98dd-b668f6f64d4a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614847 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f219e39d-b7f5-42db-800b-84a40c27eb82-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614857 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nljw4\" (UniqueName: \"kubernetes.io/projected/f219e39d-b7f5-42db-800b-84a40c27eb82-kube-api-access-nljw4\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614871 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.614881 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.618609 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b758c5-ab01-4e10-98dd-b668f6f64d4a-kube-api-access-wz95j" (OuterVolumeSpecName: "kube-api-access-wz95j") pod "51b758c5-ab01-4e10-98dd-b668f6f64d4a" (UID: "51b758c5-ab01-4e10-98dd-b668f6f64d4a"). InnerVolumeSpecName "kube-api-access-wz95j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.627146 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.627170 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51b758c5-ab01-4e10-98dd-b668f6f64d4a","Type":"ContainerDied","Data":"caae443626d95679fc346f165c28ec27aa9bc3382717f1b50e64877b253fc114"} Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.627262 5200 scope.go:117] "RemoveContainer" containerID="117490d78a8f8035872de33fdb583986c7d358214bd8254902762fe0ca83dc5e" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.629709 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-scripts" (OuterVolumeSpecName: "scripts") pod "51b758c5-ab01-4e10-98dd-b668f6f64d4a" (UID: "51b758c5-ab01-4e10-98dd-b668f6f64d4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.633672 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data" (OuterVolumeSpecName: "config-data") pod "f219e39d-b7f5-42db-800b-84a40c27eb82" (UID: "f219e39d-b7f5-42db-800b-84a40c27eb82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.647071 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f219e39d-b7f5-42db-800b-84a40c27eb82","Type":"ContainerDied","Data":"64a8811c7077799837f2727c1c9aaccd208c8542ac3791bce01ee38bfba291d6"} Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.647212 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.706791 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51b758c5-ab01-4e10-98dd-b668f6f64d4a" (UID: "51b758c5-ab01-4e10-98dd-b668f6f64d4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.716940 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f219e39d-b7f5-42db-800b-84a40c27eb82-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.716984 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.716999 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.717013 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wz95j\" (UniqueName: \"kubernetes.io/projected/51b758c5-ab01-4e10-98dd-b668f6f64d4a-kube-api-access-wz95j\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.732492 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data" (OuterVolumeSpecName: "config-data") pod "51b758c5-ab01-4e10-98dd-b668f6f64d4a" (UID: "51b758c5-ab01-4e10-98dd-b668f6f64d4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.830834 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51b758c5-ab01-4e10-98dd-b668f6f64d4a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.834039 5200 scope.go:117] "RemoveContainer" containerID="1649c4020dafd54120f86d8699e17d9c5746dd0aa90fbea37a9e7663c72fed12" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.859281 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.870475 5200 scope.go:117] "RemoveContainer" containerID="2c5b9c5d8acaa325423bad47c1a2cc971b04f556bed04e1e17377dbfb3addfbf" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.874666 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.883221 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.919443 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921158 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="probe" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921188 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="probe" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921225 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921234 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921254 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1247eec6-f811-4028-9e60-ee2e95526022" containerName="dnsmasq-dns" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921263 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1247eec6-f811-4028-9e60-ee2e95526022" containerName="dnsmasq-dns" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921283 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1247eec6-f811-4028-9e60-ee2e95526022" containerName="init" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921291 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1247eec6-f811-4028-9e60-ee2e95526022" containerName="init" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921314 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api-log" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921323 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api-log" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921341 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="cinder-scheduler" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921351 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="cinder-scheduler" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921608 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921631 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="cinder-scheduler" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 
13:51:50.921650 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" containerName="probe" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921666 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1247eec6-f811-4028-9e60-ee2e95526022" containerName="dnsmasq-dns" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.921676 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api-log" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.923869 5200 scope.go:117] "RemoveContainer" containerID="75082115eef574726d2a40ff772f41a12d7991ab0b3b9db9bee64d3ebc8f5e91" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.946624 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.951399 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-xx5md\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.952107 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.952266 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.952134 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-cinder-public-svc\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.952873 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-api-config-data\"" Nov 26 13:51:50 crc kubenswrapper[5200]: I1126 13:51:50.953061 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-cinder-internal-svc\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.040506 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-swift-storage-0\") pod \"1247eec6-f811-4028-9e60-ee2e95526022\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.040615 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-nb\") pod \"1247eec6-f811-4028-9e60-ee2e95526022\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.040751 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-svc\") pod \"1247eec6-f811-4028-9e60-ee2e95526022\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.040975 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-sb\") pod \"1247eec6-f811-4028-9e60-ee2e95526022\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.041221 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-config\") pod \"1247eec6-f811-4028-9e60-ee2e95526022\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.041637 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md6fr\" (UniqueName: \"kubernetes.io/projected/1247eec6-f811-4028-9e60-ee2e95526022-kube-api-access-md6fr\") pod \"1247eec6-f811-4028-9e60-ee2e95526022\" (UID: \"1247eec6-f811-4028-9e60-ee2e95526022\") " Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.049241 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1247eec6-f811-4028-9e60-ee2e95526022-kube-api-access-md6fr" (OuterVolumeSpecName: "kube-api-access-md6fr") pod "1247eec6-f811-4028-9e60-ee2e95526022" (UID: "1247eec6-f811-4028-9e60-ee2e95526022"). InnerVolumeSpecName "kube-api-access-md6fr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.069983 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" path="/var/lib/kubelet/pods/f219e39d-b7f5-42db-800b-84a40c27eb82/volumes" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.077344 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.109989 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1247eec6-f811-4028-9e60-ee2e95526022" (UID: "1247eec6-f811-4028-9e60-ee2e95526022"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.118294 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.120989 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1247eec6-f811-4028-9e60-ee2e95526022" (UID: "1247eec6-f811-4028-9e60-ee2e95526022"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.127581 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.138468 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144716 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92np6\" (UniqueName: \"kubernetes.io/projected/ff2e56e0-4f2e-4101-b051-111cb2cbee56-kube-api-access-92np6\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144801 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144820 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144847 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-scripts\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144896 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144963 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data-custom\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144980 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.144998 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff2e56e0-4f2e-4101-b051-111cb2cbee56-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.145029 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ff2e56e0-4f2e-4101-b051-111cb2cbee56-logs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.145095 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-md6fr\" (UniqueName: \"kubernetes.io/projected/1247eec6-f811-4028-9e60-ee2e95526022-kube-api-access-md6fr\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.145106 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.145115 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.148700 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1247eec6-f811-4028-9e60-ee2e95526022" (UID: "1247eec6-f811-4028-9e60-ee2e95526022"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.151757 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.153080 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.157087 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scheduler-config-data\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.158920 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1247eec6-f811-4028-9e60-ee2e95526022" (UID: "1247eec6-f811-4028-9e60-ee2e95526022"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.162967 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-747455c75f-8zrqn"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.170561 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-config" (OuterVolumeSpecName: "config") pod "1247eec6-f811-4028-9e60-ee2e95526022" (UID: "1247eec6-f811-4028-9e60-ee2e95526022"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.175780 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d9fcdcd8b-tl47q"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.247336 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92np6\" (UniqueName: \"kubernetes.io/projected/ff2e56e0-4f2e-4101-b051-111cb2cbee56-kube-api-access-92np6\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.247436 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-scripts\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.247494 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.247730 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.247857 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.247988 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-scripts\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.248132 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.248342 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.248470 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 
26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.248718 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7163ac1c-74a2-4383-bb8b-21a320eb6863-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.249024 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data-custom\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.249155 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.249294 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff2e56e0-4f2e-4101-b051-111cb2cbee56-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.249471 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdflj\" (UniqueName: \"kubernetes.io/projected/7163ac1c-74a2-4383-bb8b-21a320eb6863-kube-api-access-qdflj\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.249625 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2e56e0-4f2e-4101-b051-111cb2cbee56-logs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.249939 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.250077 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.250205 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1247eec6-f811-4028-9e60-ee2e95526022-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.250700 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff2e56e0-4f2e-4101-b051-111cb2cbee56-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.250833 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ff2e56e0-4f2e-4101-b051-111cb2cbee56-logs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.256006 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.256576 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.256901 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.257305 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data-custom\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.258387 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.264224 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-scripts\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.268807 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92np6\" (UniqueName: \"kubernetes.io/projected/ff2e56e0-4f2e-4101-b051-111cb2cbee56-kube-api-access-92np6\") pod \"cinder-api-0\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353113 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353161 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353200 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7163ac1c-74a2-4383-bb8b-21a320eb6863-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353256 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdflj\" (UniqueName: \"kubernetes.io/projected/7163ac1c-74a2-4383-bb8b-21a320eb6863-kube-api-access-qdflj\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353322 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-scripts\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353324 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7163ac1c-74a2-4383-bb8b-21a320eb6863-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.353347 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.356866 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.357950 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-scripts\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.358329 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.359632 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.360521 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.371850 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdflj\" (UniqueName: \"kubernetes.io/projected/7163ac1c-74a2-4383-bb8b-21a320eb6863-kube-api-access-qdflj\") pod \"cinder-scheduler-0\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.482216 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.673791 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747455c75f-8zrqn" event={"ID":"c3f6e95d-1b68-4641-a4e5-6bd172c0662e","Type":"ContainerStarted","Data":"d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb"} Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.674106 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747455c75f-8zrqn" event={"ID":"c3f6e95d-1b68-4641-a4e5-6bd172c0662e","Type":"ContainerStarted","Data":"349ff0104322cc4005bff409f30124aa60c888205b3b411fd3fe645f50aa4258"} Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.678400 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" event={"ID":"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59","Type":"ContainerStarted","Data":"a4649f26c430accc7c822a3b6948eaa51389ccaecd8fe7409b8405cd4514eba9"} Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.678453 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" event={"ID":"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59","Type":"ContainerStarted","Data":"80e3e11a46811afe460b1bc7a1b17a7c419c8b0f99bd99bac53fc0d0212ab3ce"} Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.702963 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerStarted","Data":"e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66"} Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.703215 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-central-agent" containerID="cri-o://0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef" gracePeriod=30 Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.703600 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.703877 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="proxy-httpd" containerID="cri-o://e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66" gracePeriod=30 Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.704072 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-notification-agent" containerID="cri-o://ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c" gracePeriod=30 Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.704117 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="sg-core" containerID="cri-o://a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911" gracePeriod=30 Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.721054 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f494c55f-m592t" event={"ID":"1247eec6-f811-4028-9e60-ee2e95526022","Type":"ContainerDied","Data":"3cdc738aab6e7e6721e475a00cf1a75816bbff8b486855e51f17c0e91e9dd5f9"} Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.721120 5200 scope.go:117] "RemoveContainer" containerID="def7f66910a2c22d2dc2809f86bc8ccad8aa9f0be671fc91e2d65c9c463fa23b" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.721303 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f494c55f-m592t" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.755210 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.585687112 podStartE2EDuration="59.755191276s" podCreationTimestamp="2025-11-26 13:50:52 +0000 UTC" firstStartedPulling="2025-11-26 13:50:53.31036171 +0000 UTC m=+1221.989563528" lastFinishedPulling="2025-11-26 13:51:50.479865874 +0000 UTC m=+1279.159067692" observedRunningTime="2025-11-26 13:51:51.735242838 +0000 UTC m=+1280.414444656" watchObservedRunningTime="2025-11-26 13:51:51.755191276 +0000 UTC m=+1280.434393094" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.787081 5200 scope.go:117] "RemoveContainer" containerID="e1c6eadee1095f9191dc95e8766d873ca66582e5414c9d5c88c857e894141461" Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.831646 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f494c55f-m592t"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.853637 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f494c55f-m592t"] Nov 26 13:51:51 crc kubenswrapper[5200]: I1126 13:51:51.913863 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:51:51 crc kubenswrapper[5200]: W1126 13:51:51.928640 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff2e56e0_4f2e_4101_b051_111cb2cbee56.slice/crio-f54de4cd84d2e6f4c6259d86e5e2a578b1d331db0f82e096ae21b7207df0c712 WatchSource:0}: Error finding container f54de4cd84d2e6f4c6259d86e5e2a578b1d331db0f82e096ae21b7207df0c712: Status 404 returned error can't find the container with id f54de4cd84d2e6f4c6259d86e5e2a578b1d331db0f82e096ae21b7207df0c712 Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.098165 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:51:52 crc kubenswrapper[5200]: E1126 13:51:52.246967 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d51f5b4_9537_4b72_b1c0_d7c26a94ffc4.slice/crio-conmon-0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d51f5b4_9537_4b72_b1c0_d7c26a94ffc4.slice/crio-0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef.scope\": RecentStats: unable to find data in memory cache]" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.319667 5200 
prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f219e39d-b7f5-42db-800b-84a40c27eb82" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.157:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.767203 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" event={"ID":"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59","Type":"ContainerStarted","Data":"c58ee236e1d2b49c25a04da65637b68cbc07d2541bfa6191ac3c1b680febe5ff"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.769552 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.769660 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.775134 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ff2e56e0-4f2e-4101-b051-111cb2cbee56","Type":"ContainerStarted","Data":"7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.775185 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ff2e56e0-4f2e-4101-b051-111cb2cbee56","Type":"ContainerStarted","Data":"f54de4cd84d2e6f4c6259d86e5e2a578b1d331db0f82e096ae21b7207df0c712"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.781974 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerID="e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66" exitCode=0 Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.782006 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerID="a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911" exitCode=2 Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.782031 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerID="0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef" exitCode=0 Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.782130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerDied","Data":"e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.782218 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerDied","Data":"a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.782231 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerDied","Data":"0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.785659 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7163ac1c-74a2-4383-bb8b-21a320eb6863","Type":"ContainerStarted","Data":"8179fb265e4f83a60e3ca62714109aa5e9d0253757c808d971ed3165d6745dd9"} 
Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.788026 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747455c75f-8zrqn" event={"ID":"c3f6e95d-1b68-4641-a4e5-6bd172c0662e","Type":"ContainerStarted","Data":"50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646"} Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.788185 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.804787 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" podStartSLOduration=7.804735242 podStartE2EDuration="7.804735242s" podCreationTimestamp="2025-11-26 13:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:52.794712297 +0000 UTC m=+1281.473914135" watchObservedRunningTime="2025-11-26 13:51:52.804735242 +0000 UTC m=+1281.483937070" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.838400 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-747455c75f-8zrqn" podStartSLOduration=8.838372421999999 podStartE2EDuration="8.838372422s" podCreationTimestamp="2025-11-26 13:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:52.818559508 +0000 UTC m=+1281.497761336" watchObservedRunningTime="2025-11-26 13:51:52.838372422 +0000 UTC m=+1281.517574240" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.995654 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1247eec6-f811-4028-9e60-ee2e95526022" path="/var/lib/kubelet/pods/1247eec6-f811-4028-9e60-ee2e95526022/volumes" Nov 26 13:51:52 crc kubenswrapper[5200]: I1126 13:51:52.996656 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b758c5-ab01-4e10-98dd-b668f6f64d4a" path="/var/lib/kubelet/pods/51b758c5-ab01-4e10-98dd-b668f6f64d4a/volumes" Nov 26 13:51:53 crc kubenswrapper[5200]: I1126 13:51:53.799174 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ff2e56e0-4f2e-4101-b051-111cb2cbee56","Type":"ContainerStarted","Data":"6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb"} Nov 26 13:51:53 crc kubenswrapper[5200]: I1126 13:51:53.799384 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/cinder-api-0" Nov 26 13:51:53 crc kubenswrapper[5200]: I1126 13:51:53.803458 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7163ac1c-74a2-4383-bb8b-21a320eb6863","Type":"ContainerStarted","Data":"cda7517c7cfab2ef565f5c3fa5201af6cdade851c6625c180d2015174e06c39f"} Nov 26 13:51:53 crc kubenswrapper[5200]: I1126 13:51:53.803485 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7163ac1c-74a2-4383-bb8b-21a320eb6863","Type":"ContainerStarted","Data":"5bb653ca071dc9a61e5ecf99b1fe3a17c36b5dd2807279eee649a3530583f056"} Nov 26 13:51:53 crc kubenswrapper[5200]: I1126 13:51:53.846494 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.846467514 podStartE2EDuration="3.846467514s" podCreationTimestamp="2025-11-26 13:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:53.816997544 +0000 UTC m=+1282.496199372" watchObservedRunningTime="2025-11-26 13:51:53.846467514 +0000 UTC m=+1282.525669332" Nov 26 13:51:53 crc kubenswrapper[5200]: I1126 13:51:53.869504 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.869481062 podStartE2EDuration="2.869481062s" podCreationTimestamp="2025-11-26 13:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:51:53.860092374 +0000 UTC m=+1282.539294192" watchObservedRunningTime="2025-11-26 13:51:53.869481062 +0000 UTC m=+1282.548682870" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.259371 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.354787 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-sg-core-conf-yaml\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.355011 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-config-data\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.355037 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-scripts\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.355097 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-run-httpd\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.355121 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-log-httpd\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.355204 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpsjt\" (UniqueName: \"kubernetes.io/projected/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-kube-api-access-wpsjt\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.355308 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-combined-ca-bundle\") pod \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\" (UID: \"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4\") " Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.356112 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.356265 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.372966 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-scripts" (OuterVolumeSpecName: "scripts") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.374824 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-kube-api-access-wpsjt" (OuterVolumeSpecName: "kube-api-access-wpsjt") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "kube-api-access-wpsjt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.390500 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.445561 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.458020 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.458056 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.458068 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wpsjt\" (UniqueName: \"kubernetes.io/projected/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-kube-api-access-wpsjt\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.458077 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.458087 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.458096 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.498160 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-config-data" (OuterVolumeSpecName: "config-data") pod "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" (UID: "9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.560225 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.825182 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerID="ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c" exitCode=0 Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.826574 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.827140 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerDied","Data":"ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c"} Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.827185 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4","Type":"ContainerDied","Data":"2eec25b39db3b1e1e3b26ef78154f9c4728a68a4384476ce26c7dd8a211907bd"} Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.827208 5200 scope.go:117] "RemoveContainer" containerID="e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.854291 5200 scope.go:117] "RemoveContainer" containerID="a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.908053 5200 scope.go:117] "RemoveContainer" containerID="ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.912427 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.949592 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.961935 5200 scope.go:117] "RemoveContainer" containerID="0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.967871 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970248 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="proxy-httpd" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970276 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="proxy-httpd" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970298 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-central-agent" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970329 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-central-agent" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970351 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-notification-agent" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970360 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-notification-agent" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970444 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="sg-core" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970452 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="sg-core" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970889 5200 
memory_manager.go:356] "RemoveStaleState removing state" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="sg-core" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970913 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-notification-agent" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970925 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="ceilometer-central-agent" Nov 26 13:51:54 crc kubenswrapper[5200]: I1126 13:51:54.970967 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" containerName="proxy-httpd" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.987789 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.988533 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.991944 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.992360 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.999235 5200 scope.go:117] "RemoveContainer" containerID="e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66" Nov 26 13:51:55 crc kubenswrapper[5200]: E1126 13:51:54.999569 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66\": container with ID starting with e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66 not found: ID does not exist" containerID="e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.999607 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66"} err="failed to get container status \"e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66\": rpc error: code = NotFound desc = could not find container \"e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66\": container with ID starting with e6cae260c635818fe0f31c1e282fb24915294160a983f0c5948c898f2d616b66 not found: ID does not exist" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.999634 5200 scope.go:117] "RemoveContainer" containerID="a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911" Nov 26 13:51:55 crc kubenswrapper[5200]: E1126 13:51:54.999864 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911\": container with ID starting with a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911 not found: ID does not exist" containerID="a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.999886 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911"} 
err="failed to get container status \"a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911\": rpc error: code = NotFound desc = could not find container \"a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911\": container with ID starting with a019130825bb94a9033125925eb399bdbb5576a34e6094897b79b6aa71043911 not found: ID does not exist" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:54.999904 5200 scope.go:117] "RemoveContainer" containerID="ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c" Nov 26 13:51:55 crc kubenswrapper[5200]: E1126 13:51:55.000153 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c\": container with ID starting with ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c not found: ID does not exist" containerID="ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.000176 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c"} err="failed to get container status \"ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c\": rpc error: code = NotFound desc = could not find container \"ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c\": container with ID starting with ee86f132f739198dba4b6e5eb08b6c52b9df3b51728a7ae46c0c9d200d9ed55c not found: ID does not exist" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.000193 5200 scope.go:117] "RemoveContainer" containerID="0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef" Nov 26 13:51:55 crc kubenswrapper[5200]: E1126 13:51:55.000493 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef\": container with ID starting with 0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef not found: ID does not exist" containerID="0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.000519 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef"} err="failed to get container status \"0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef\": rpc error: code = NotFound desc = could not find container \"0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef\": container with ID starting with 0679675f08eff8cc51fcbe68d7a4328a5a0f6d06842a8afd046a1f1624e688ef not found: ID does not exist" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.006435 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4" path="/var/lib/kubelet/pods/9d51f5b4-9537-4b72-b1c0-d7c26a94ffc4/volumes" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.075716 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-scripts\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.075777 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.076089 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlw8\" (UniqueName: \"kubernetes.io/projected/90fc40a5-dec9-4784-80ce-04dfe978384d-kube-api-access-ghlw8\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.076613 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-config-data\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.076778 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-run-httpd\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.076802 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-log-httpd\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.076852 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.179869 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-run-httpd\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.179951 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-log-httpd\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.180002 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.181162 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-run-httpd\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 
26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.180069 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-scripts\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.181312 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.181330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-log-httpd\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.181571 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlw8\" (UniqueName: \"kubernetes.io/projected/90fc40a5-dec9-4784-80ce-04dfe978384d-kube-api-access-ghlw8\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.182065 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-config-data\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.187455 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-scripts\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.190771 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-config-data\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.191253 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.194274 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.212621 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlw8\" (UniqueName: \"kubernetes.io/projected/90fc40a5-dec9-4784-80ce-04dfe978384d-kube-api-access-ghlw8\") pod \"ceilometer-0\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 
13:51:55.318803 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:51:55 crc kubenswrapper[5200]: W1126 13:51:55.864514 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fc40a5_dec9_4784_80ce_04dfe978384d.slice/crio-7dee69e1ae895d0cd349baea8a317930212310c8a0ea153a1732dd66dbe38b76 WatchSource:0}: Error finding container 7dee69e1ae895d0cd349baea8a317930212310c8a0ea153a1732dd66dbe38b76: Status 404 returned error can't find the container with id 7dee69e1ae895d0cd349baea8a317930212310c8a0ea153a1732dd66dbe38b76 Nov 26 13:51:55 crc kubenswrapper[5200]: I1126 13:51:55.867280 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:51:56 crc kubenswrapper[5200]: I1126 13:51:56.483338 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 13:51:56 crc kubenswrapper[5200]: I1126 13:51:56.866541 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerStarted","Data":"f077b3a0c10f4da8a9a517f4327afa2653e472b4f95aa2718b566eb6d6475ca3"} Nov 26 13:51:56 crc kubenswrapper[5200]: I1126 13:51:56.867054 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerStarted","Data":"7dee69e1ae895d0cd349baea8a317930212310c8a0ea153a1732dd66dbe38b76"} Nov 26 13:51:57 crc kubenswrapper[5200]: I1126 13:51:57.888212 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerStarted","Data":"a2880ab14a768d7e3852fc5bfb65aaaa88fb10a918edb3f58b7cad6703738b92"} Nov 26 13:51:58 crc kubenswrapper[5200]: I1126 13:51:58.903096 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerStarted","Data":"ddb3e211afc79419f0801715dc70052c8c7ac5601c22e91ffeecda26d7bfa4a9"} Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.311184 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.370710 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.438277 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dcbdf5458-6txbv"] Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.438574 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-api-7dcbdf5458-6txbv" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api-log" containerID="cri-o://623320551b9ec677d7c7026f6bc7b35f9e5ebb9cfb04aaf275e60244c0b0ecd1" gracePeriod=30 Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.439094 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-api-7dcbdf5458-6txbv" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api" containerID="cri-o://a1dea5d541937c30fc574b8d069744ef95d747ea3e03dc28c29d3c404bef2735" gracePeriod=30 Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.925021 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerStarted","Data":"64c617d17740dc682e5429c752f5ce7e81fb0f548db76d2cce0ede8ba83038d7"} Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.925203 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.927861 5200 generic.go:358] "Generic (PLEG): container finished" podID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerID="623320551b9ec677d7c7026f6bc7b35f9e5ebb9cfb04aaf275e60244c0b0ecd1" exitCode=143 Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.928051 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcbdf5458-6txbv" event={"ID":"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392","Type":"ContainerDied","Data":"623320551b9ec677d7c7026f6bc7b35f9e5ebb9cfb04aaf275e60244c0b0ecd1"} Nov 26 13:52:00 crc kubenswrapper[5200]: I1126 13:52:00.970642 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.809865246 podStartE2EDuration="6.970615876s" podCreationTimestamp="2025-11-26 13:51:54 +0000 UTC" firstStartedPulling="2025-11-26 13:51:55.866468986 +0000 UTC m=+1284.545670824" lastFinishedPulling="2025-11-26 13:52:00.027219596 +0000 UTC m=+1288.706421454" observedRunningTime="2025-11-26 13:52:00.959449621 +0000 UTC m=+1289.638651449" watchObservedRunningTime="2025-11-26 13:52:00.970615876 +0000 UTC m=+1289.649817704" Nov 26 13:52:01 crc kubenswrapper[5200]: I1126 13:52:01.744830 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 13:52:02 crc kubenswrapper[5200]: I1126 13:52:02.569437 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 26 13:52:03 crc kubenswrapper[5200]: I1126 13:52:03.619674 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dcbdf5458-6txbv" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:56004->10.217.0.154:9311: read: connection reset by peer" Nov 26 13:52:03 crc kubenswrapper[5200]: I1126 13:52:03.962579 5200 generic.go:358] "Generic (PLEG): container finished" podID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerID="a1dea5d541937c30fc574b8d069744ef95d747ea3e03dc28c29d3c404bef2735" exitCode=0 Nov 26 13:52:03 crc kubenswrapper[5200]: I1126 13:52:03.962714 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcbdf5458-6txbv" event={"ID":"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392","Type":"ContainerDied","Data":"a1dea5d541937c30fc574b8d069744ef95d747ea3e03dc28c29d3c404bef2735"} Nov 26 13:52:05 crc kubenswrapper[5200]: I1126 13:52:05.210491 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dcbdf5458-6txbv" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": dial tcp 10.217.0.154:9311: connect: connection refused" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.694276 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.766528 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-combined-ca-bundle\") pod \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.767036 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data\") pod \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.767296 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data-custom\") pod \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.767375 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzh8j\" (UniqueName: \"kubernetes.io/projected/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-kube-api-access-vzh8j\") pod \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.767465 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-logs\") pod \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\" (UID: \"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392\") " Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.768450 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-logs" (OuterVolumeSpecName: "logs") pod "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" (UID: "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.774262 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-kube-api-access-vzh8j" (OuterVolumeSpecName: "kube-api-access-vzh8j") pod "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" (UID: "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392"). InnerVolumeSpecName "kube-api-access-vzh8j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.789084 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" (UID: "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.800766 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" (UID: "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.821109 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data" (OuterVolumeSpecName: "config-data") pod "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" (UID: "9a16cfaf-3944-4dce-bb5d-9e6bce2ed392"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.832288 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.869568 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.869614 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.869625 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.869635 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vzh8j\" (UniqueName: \"kubernetes.io/projected/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-kube-api-access-vzh8j\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:06 crc kubenswrapper[5200]: I1126 13:52:06.869647 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.006100 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dcbdf5458-6txbv" Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.010487 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcbdf5458-6txbv" event={"ID":"9a16cfaf-3944-4dce-bb5d-9e6bce2ed392","Type":"ContainerDied","Data":"e4ee3eb159d2775c7e423a56ef95b5613486eb92814f5f89bdb132cc52a411d5"} Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.010548 5200 scope.go:117] "RemoveContainer" containerID="a1dea5d541937c30fc574b8d069744ef95d747ea3e03dc28c29d3c404bef2735" Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.072088 5200 scope.go:117] "RemoveContainer" containerID="623320551b9ec677d7c7026f6bc7b35f9e5ebb9cfb04aaf275e60244c0b0ecd1" Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.083052 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dcbdf5458-6txbv"] Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.112055 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7dcbdf5458-6txbv"] Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.263538 5200 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4ce2de04-4f6b-49fa-9946-5ef9663e95e8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4ce2de04-4f6b-49fa-9946-5ef9663e95e8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4ce2de04_4f6b_49fa_9946_5ef9663e95e8.slice" Nov 26 13:52:07 crc kubenswrapper[5200]: E1126 13:52:07.263617 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod4ce2de04-4f6b-49fa-9946-5ef9663e95e8] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod4ce2de04-4f6b-49fa-9946-5ef9663e95e8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4ce2de04_4f6b_49fa_9946_5ef9663e95e8.slice" pod="openstack/neutron-db-sync-vk528" podUID="4ce2de04-4f6b-49fa-9946-5ef9663e95e8" Nov 26 13:52:07 crc kubenswrapper[5200]: I1126 13:52:07.616054 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:52:08 crc kubenswrapper[5200]: I1126 13:52:08.018205 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vk528" Nov 26 13:52:08 crc kubenswrapper[5200]: I1126 13:52:08.979964 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" path="/var/lib/kubelet/pods/9a16cfaf-3944-4dce-bb5d-9e6bce2ed392/volumes" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.579554 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.581513 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api-log" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.581552 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api-log" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.581569 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.581577 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.581923 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api-log" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.581958 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a16cfaf-3944-4dce-bb5d-9e6bce2ed392" containerName="barbican-api" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.591346 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.595505 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-config\"" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.595507 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-config-secret\"" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.597780 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstackclient-openstackclient-dockercfg-wgr6j\"" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.602809 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.657949 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.658496 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq87g\" (UniqueName: \"kubernetes.io/projected/b8c23725-1545-4d07-8fc3-a4dd066a685e-kube-api-access-nq87g\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.658562 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.658587 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.760716 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.760785 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.760909 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.761067 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nq87g\" (UniqueName: \"kubernetes.io/projected/b8c23725-1545-4d07-8fc3-a4dd066a685e-kube-api-access-nq87g\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.761770 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.773117 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.773570 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.795917 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq87g\" (UniqueName: \"kubernetes.io/projected/b8c23725-1545-4d07-8fc3-a4dd066a685e-kube-api-access-nq87g\") pod \"openstackclient\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " pod="openstack/openstackclient" Nov 26 13:52:09 crc kubenswrapper[5200]: I1126 13:52:09.933890 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 13:52:10 crc kubenswrapper[5200]: I1126 13:52:10.513598 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 13:52:11 crc kubenswrapper[5200]: I1126 13:52:11.053819 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b8c23725-1545-4d07-8fc3-a4dd066a685e","Type":"ContainerStarted","Data":"d33650bab1edb0ce08fdc6576a4fff28d386da258a2d12518fb1e234b7403aad"} Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.406329 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.565544 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75cc5c4f79-dfmjl"] Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.581299 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.587571 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-swift-public-svc\"" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.588129 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-swift-internal-svc\"" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.588371 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-proxy-config-data\"" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.599776 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75cc5c4f79-dfmjl"] Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.649954 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-internal-tls-certs\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.650016 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-log-httpd\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.650165 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-combined-ca-bundle\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.650455 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-run-httpd\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.650540 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.650616 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-config-data\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.650961 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-etc-swift\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: 
\"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.651026 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6kkg\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-kube-api-access-z6kkg\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753285 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-combined-ca-bundle\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753709 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-run-httpd\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753734 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753760 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-config-data\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753838 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-etc-swift\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753863 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6kkg\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-kube-api-access-z6kkg\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753914 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-internal-tls-certs\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.753935 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-log-httpd\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: 
\"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.754611 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-log-httpd\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.754667 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-run-httpd\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.761257 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-internal-tls-certs\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.761948 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-etc-swift\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.762669 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-combined-ca-bundle\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.762761 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-config-data\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.764086 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.773506 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6kkg\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-kube-api-access-z6kkg\") pod \"swift-proxy-75cc5c4f79-dfmjl\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:13 crc kubenswrapper[5200]: I1126 13:52:13.912769 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:14 crc kubenswrapper[5200]: I1126 13:52:14.563358 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75cc5c4f79-dfmjl"] Nov 26 13:52:14 crc kubenswrapper[5200]: W1126 13:52:14.576086 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0772c27_b753_402f_aded_0d4050bc80bd.slice/crio-03070f3a8b2aa95508e55cda4d5a50a20d3d05ca5965aa8664bf38504e8b2d2b WatchSource:0}: Error finding container 03070f3a8b2aa95508e55cda4d5a50a20d3d05ca5965aa8664bf38504e8b2d2b: Status 404 returned error can't find the container with id 03070f3a8b2aa95508e55cda4d5a50a20d3d05ca5965aa8664bf38504e8b2d2b Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.009935 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.010569 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-log" containerID="cri-o://083601691bb0bde35bfac41a3c731f77bedc5b235a717cdf92b5a317fed27f1a" gracePeriod=30 Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.011019 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-httpd" containerID="cri-o://d07b1c117ffd009ab064a77d5433bc202a99c4a7faeb7d1eb6420d58dd9b017f" gracePeriod=30 Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.124321 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" event={"ID":"b0772c27-b753-402f-aded-0d4050bc80bd","Type":"ContainerStarted","Data":"3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9"} Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.124394 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" event={"ID":"b0772c27-b753-402f-aded-0d4050bc80bd","Type":"ContainerStarted","Data":"03070f3a8b2aa95508e55cda4d5a50a20d3d05ca5965aa8664bf38504e8b2d2b"} Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.777937 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.779929 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-central-agent" containerID="cri-o://f077b3a0c10f4da8a9a517f4327afa2653e472b4f95aa2718b566eb6d6475ca3" gracePeriod=30 Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.779997 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="sg-core" containerID="cri-o://ddb3e211afc79419f0801715dc70052c8c7ac5601c22e91ffeecda26d7bfa4a9" gracePeriod=30 Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.780011 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-notification-agent" containerID="cri-o://a2880ab14a768d7e3852fc5bfb65aaaa88fb10a918edb3f58b7cad6703738b92" gracePeriod=30 Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 
13:52:15.780044 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="proxy-httpd" containerID="cri-o://64c617d17740dc682e5429c752f5ce7e81fb0f548db76d2cce0ede8ba83038d7" gracePeriod=30 Nov 26 13:52:15 crc kubenswrapper[5200]: I1126 13:52:15.788156 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 13:52:16 crc kubenswrapper[5200]: I1126 13:52:16.145027 5200 generic.go:358] "Generic (PLEG): container finished" podID="29258dfe-0805-481f-ad5f-a133bd88517e" containerID="083601691bb0bde35bfac41a3c731f77bedc5b235a717cdf92b5a317fed27f1a" exitCode=143 Nov 26 13:52:16 crc kubenswrapper[5200]: I1126 13:52:16.145097 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29258dfe-0805-481f-ad5f-a133bd88517e","Type":"ContainerDied","Data":"083601691bb0bde35bfac41a3c731f77bedc5b235a717cdf92b5a317fed27f1a"} Nov 26 13:52:16 crc kubenswrapper[5200]: I1126 13:52:16.150044 5200 generic.go:358] "Generic (PLEG): container finished" podID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerID="ddb3e211afc79419f0801715dc70052c8c7ac5601c22e91ffeecda26d7bfa4a9" exitCode=2 Nov 26 13:52:16 crc kubenswrapper[5200]: I1126 13:52:16.150120 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerDied","Data":"ddb3e211afc79419f0801715dc70052c8c7ac5601c22e91ffeecda26d7bfa4a9"} Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.169858 5200 generic.go:358] "Generic (PLEG): container finished" podID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerID="64c617d17740dc682e5429c752f5ce7e81fb0f548db76d2cce0ede8ba83038d7" exitCode=0 Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.170405 5200 generic.go:358] "Generic (PLEG): container finished" podID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerID="f077b3a0c10f4da8a9a517f4327afa2653e472b4f95aa2718b566eb6d6475ca3" exitCode=0 Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.171233 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerDied","Data":"64c617d17740dc682e5429c752f5ce7e81fb0f548db76d2cce0ede8ba83038d7"} Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.171285 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerDied","Data":"f077b3a0c10f4da8a9a517f4327afa2653e472b4f95aa2718b566eb6d6475ca3"} Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.202223 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.202555 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-log" containerID="cri-o://0a25d3f0874f9fafd0dd6fc7dee32b46e0f09a9d25839443eb7f11f641f50a5b" gracePeriod=30 Nov 26 13:52:17 crc kubenswrapper[5200]: I1126 13:52:17.204659 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-httpd" 
containerID="cri-o://57bd9432cfc7e5ffb3b2f1f725e50337c201944249329a6b2edaf1076f1810ab" gracePeriod=30 Nov 26 13:52:18 crc kubenswrapper[5200]: I1126 13:52:18.156199 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:47146->10.217.0.150:9292: read: connection reset by peer" Nov 26 13:52:18 crc kubenswrapper[5200]: I1126 13:52:18.215009 5200 generic.go:358] "Generic (PLEG): container finished" podID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerID="0a25d3f0874f9fafd0dd6fc7dee32b46e0f09a9d25839443eb7f11f641f50a5b" exitCode=143 Nov 26 13:52:18 crc kubenswrapper[5200]: I1126 13:52:18.215115 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"baadd4af-f958-45eb-ae2f-eb5ecead58a7","Type":"ContainerDied","Data":"0a25d3f0874f9fafd0dd6fc7dee32b46e0f09a9d25839443eb7f11f641f50a5b"} Nov 26 13:52:19 crc kubenswrapper[5200]: I1126 13:52:19.245269 5200 generic.go:358] "Generic (PLEG): container finished" podID="29258dfe-0805-481f-ad5f-a133bd88517e" containerID="d07b1c117ffd009ab064a77d5433bc202a99c4a7faeb7d1eb6420d58dd9b017f" exitCode=0 Nov 26 13:52:19 crc kubenswrapper[5200]: I1126 13:52:19.245378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29258dfe-0805-481f-ad5f-a133bd88517e","Type":"ContainerDied","Data":"d07b1c117ffd009ab064a77d5433bc202a99c4a7faeb7d1eb6420d58dd9b017f"} Nov 26 13:52:20 crc kubenswrapper[5200]: I1126 13:52:20.264516 5200 generic.go:358] "Generic (PLEG): container finished" podID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerID="a2880ab14a768d7e3852fc5bfb65aaaa88fb10a918edb3f58b7cad6703738b92" exitCode=0 Nov 26 13:52:20 crc kubenswrapper[5200]: I1126 13:52:20.264600 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerDied","Data":"a2880ab14a768d7e3852fc5bfb65aaaa88fb10a918edb3f58b7cad6703738b92"} Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.316755 5200 generic.go:358] "Generic (PLEG): container finished" podID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerID="57bd9432cfc7e5ffb3b2f1f725e50337c201944249329a6b2edaf1076f1810ab" exitCode=0 Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.316853 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"baadd4af-f958-45eb-ae2f-eb5ecead58a7","Type":"ContainerDied","Data":"57bd9432cfc7e5ffb3b2f1f725e50337c201944249329a6b2edaf1076f1810ab"} Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.627428 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.689889 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.722205 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738437 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-httpd-run\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738524 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-logs\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738559 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-combined-ca-bundle\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738592 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghlw8\" (UniqueName: \"kubernetes.io/projected/90fc40a5-dec9-4784-80ce-04dfe978384d-kube-api-access-ghlw8\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738759 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-config-data\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738789 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-combined-ca-bundle\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.738937 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-scripts\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739066 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-log-httpd\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739094 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-internal-tls-certs\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739113 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-scripts\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") 
" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739201 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739224 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-run-httpd\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739290 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-sg-core-conf-yaml\") pod \"90fc40a5-dec9-4784-80ce-04dfe978384d\" (UID: \"90fc40a5-dec9-4784-80ce-04dfe978384d\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739313 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfn8z\" (UniqueName: \"kubernetes.io/projected/baadd4af-f958-45eb-ae2f-eb5ecead58a7-kube-api-access-mfn8z\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.739334 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-config-data\") pod \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\" (UID: \"baadd4af-f958-45eb-ae2f-eb5ecead58a7\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.743384 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.748871 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.749150 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.749207 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-logs" (OuterVolumeSpecName: "logs") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.751193 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-scripts" (OuterVolumeSpecName: "scripts") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.754072 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fc40a5-dec9-4784-80ce-04dfe978384d-kube-api-access-ghlw8" (OuterVolumeSpecName: "kube-api-access-ghlw8") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "kube-api-access-ghlw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.754843 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.760648 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baadd4af-f958-45eb-ae2f-eb5ecead58a7-kube-api-access-mfn8z" (OuterVolumeSpecName: "kube-api-access-mfn8z") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "kube-api-access-mfn8z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.761166 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-scripts" (OuterVolumeSpecName: "scripts") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.786065 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841059 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4ljq\" (UniqueName: \"kubernetes.io/projected/29258dfe-0805-481f-ad5f-a133bd88517e-kube-api-access-z4ljq\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841134 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841203 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-httpd-run\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841315 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-combined-ca-bundle\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841421 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-config-data\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841450 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-scripts\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841536 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-logs\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.841582 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-public-tls-certs\") pod \"29258dfe-0805-481f-ad5f-a133bd88517e\" (UID: \"29258dfe-0805-481f-ad5f-a133bd88517e\") " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842523 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842541 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842561 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842570 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90fc40a5-dec9-4784-80ce-04dfe978384d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842580 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfn8z\" (UniqueName: \"kubernetes.io/projected/baadd4af-f958-45eb-ae2f-eb5ecead58a7-kube-api-access-mfn8z\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842591 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842600 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baadd4af-f958-45eb-ae2f-eb5ecead58a7-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842608 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842617 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghlw8\" (UniqueName: \"kubernetes.io/projected/90fc40a5-dec9-4784-80ce-04dfe978384d-kube-api-access-ghlw8\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.842628 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.843381 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.845314 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-logs" (OuterVolumeSpecName: "logs") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.857453 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.858065 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29258dfe-0805-481f-ad5f-a133bd88517e-kube-api-access-z4ljq" (OuterVolumeSpecName: "kube-api-access-z4ljq") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). 
InnerVolumeSpecName "kube-api-access-z4ljq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.863433 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.864094 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-scripts" (OuterVolumeSpecName: "scripts") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.897461 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.903147 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-config-data" (OuterVolumeSpecName: "config-data") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.931881 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.944983 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945014 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945025 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4ljq\" (UniqueName: \"kubernetes.io/projected/29258dfe-0805-481f-ad5f-a133bd88517e-kube-api-access-z4ljq\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945058 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945067 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945076 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945084 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945092 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.945103 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29258dfe-0805-481f-ad5f-a133bd88517e-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.958334 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.985889 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "baadd4af-f958-45eb-ae2f-eb5ecead58a7" (UID: "baadd4af-f958-45eb-ae2f-eb5ecead58a7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:21 crc kubenswrapper[5200]: I1126 13:52:21.993975 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.001415 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-config-data" (OuterVolumeSpecName: "config-data") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.047591 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.048500 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baadd4af-f958-45eb-ae2f-eb5ecead58a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.048515 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.048530 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.060048 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "29258dfe-0805-481f-ad5f-a133bd88517e" (UID: "29258dfe-0805-481f-ad5f-a133bd88517e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.070014 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-config-data" (OuterVolumeSpecName: "config-data") pod "90fc40a5-dec9-4784-80ce-04dfe978384d" (UID: "90fc40a5-dec9-4784-80ce-04dfe978384d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.150585 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29258dfe-0805-481f-ad5f-a133bd88517e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.150625 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fc40a5-dec9-4784-80ce-04dfe978384d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.333785 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90fc40a5-dec9-4784-80ce-04dfe978384d","Type":"ContainerDied","Data":"7dee69e1ae895d0cd349baea8a317930212310c8a0ea153a1732dd66dbe38b76"} Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.333847 5200 scope.go:117] "RemoveContainer" containerID="64c617d17740dc682e5429c752f5ce7e81fb0f548db76d2cce0ede8ba83038d7" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.334017 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.342336 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" event={"ID":"b0772c27-b753-402f-aded-0d4050bc80bd","Type":"ContainerStarted","Data":"878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080"} Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.342458 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.342751 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.347045 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b8c23725-1545-4d07-8fc3-a4dd066a685e","Type":"ContainerStarted","Data":"819c90649cfd578627699cf1864ff703ef1f85ca28a43cafd1731fbe6e832be9"} Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.356453 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"29258dfe-0805-481f-ad5f-a133bd88517e","Type":"ContainerDied","Data":"deca4b24c051fb0576f663ee2c2e7bf8369138e851edcd1b8f3ffffb553091c8"} Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.356614 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.359126 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.360095 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"baadd4af-f958-45eb-ae2f-eb5ecead58a7","Type":"ContainerDied","Data":"32ee7c0e1f52bf2f26996fda9dc185111c4c04dc761d14107ab92f62f996ed02"} Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.360228 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.372435 5200 scope.go:117] "RemoveContainer" containerID="ddb3e211afc79419f0801715dc70052c8c7ac5601c22e91ffeecda26d7bfa4a9" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.374551 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" podStartSLOduration=9.374530753 podStartE2EDuration="9.374530753s" podCreationTimestamp="2025-11-26 13:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:22.371593623 +0000 UTC m=+1311.050795451" watchObservedRunningTime="2025-11-26 13:52:22.374530753 +0000 UTC m=+1311.053732571" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.394559 5200 scope.go:117] "RemoveContainer" containerID="a2880ab14a768d7e3852fc5bfb65aaaa88fb10a918edb3f58b7cad6703738b92" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.405213 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.417547 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.434916 5200 scope.go:117] "RemoveContainer" containerID="f077b3a0c10f4da8a9a517f4327afa2653e472b4f95aa2718b566eb6d6475ca3" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.436454 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437671 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-notification-agent" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437702 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-notification-agent" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437715 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="proxy-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437722 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="proxy-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437739 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-central-agent" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437747 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-central-agent" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437770 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437776 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437790 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-log" Nov 26 13:52:22 crc kubenswrapper[5200]: 
I1126 13:52:22.437796 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-log" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437809 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437816 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437831 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-log" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437837 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-log" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437859 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="sg-core" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.437864 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="sg-core" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438041 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="sg-core" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438051 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-log" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438059 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-notification-agent" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438068 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" containerName="glance-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438079 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438095 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="ceilometer-central-agent" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438102 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" containerName="glance-log" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.438110 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" containerName="proxy-httpd" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.451406 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.454320 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.454783 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.464035 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.464054 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.710359298 podStartE2EDuration="13.464025936s" podCreationTimestamp="2025-11-26 13:52:09 +0000 UTC" firstStartedPulling="2025-11-26 13:52:10.512410423 +0000 UTC m=+1299.191612241" lastFinishedPulling="2025-11-26 13:52:21.266077061 +0000 UTC m=+1309.945278879" observedRunningTime="2025-11-26 13:52:22.426002966 +0000 UTC m=+1311.105204784" watchObservedRunningTime="2025-11-26 13:52:22.464025936 +0000 UTC m=+1311.143227754" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.498168 5200 scope.go:117] "RemoveContainer" containerID="d07b1c117ffd009ab064a77d5433bc202a99c4a7faeb7d1eb6420d58dd9b017f" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.507740 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.518192 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.546576 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.552713 5200 scope.go:117] "RemoveContainer" containerID="083601691bb0bde35bfac41a3c731f77bedc5b235a717cdf92b5a317fed27f1a" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.569935 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570429 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-run-httpd\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570497 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570582 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcht\" (UniqueName: \"kubernetes.io/projected/2bf616d2-17fd-4cff-985e-605babde3455-kube-api-access-jvcht\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570632 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-scripts\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570795 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-config-data\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570834 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-log-httpd\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.570874 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.589806 5200 scope.go:117] "RemoveContainer" containerID="57bd9432cfc7e5ffb3b2f1f725e50337c201944249329a6b2edaf1076f1810ab" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.591176 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.606924 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.609841 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-q4wxh\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.610090 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-internal-svc\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.611243 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.611447 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.615721 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.618639 5200 scope.go:117] "RemoveContainer" containerID="0a25d3f0874f9fafd0dd6fc7dee32b46e0f09a9d25839443eb7f11f641f50a5b" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.625533 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.631574 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-public-svc\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.631703 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.634184 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.644312 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.683347 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.684275 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.684395 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-run-httpd\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.684480 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.684656 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhl87\" (UniqueName: \"kubernetes.io/projected/046ac75d-4cfe-40ed-aca5-a5875ea641d8-kube-api-access-jhl87\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.684810 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcht\" (UniqueName: \"kubernetes.io/projected/2bf616d2-17fd-4cff-985e-605babde3455-kube-api-access-jvcht\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.684914 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.685033 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.685134 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-scripts\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.685220 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.685359 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.685755 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.685912 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-config-data\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.686019 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.686232 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-log-httpd\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.687527 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-log-httpd\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.694966 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-run-httpd\") pod \"ceilometer-0\" (UID: 
\"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.698932 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-config-data\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.702183 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcht\" (UniqueName: \"kubernetes.io/projected/2bf616d2-17fd-4cff-985e-605babde3455-kube-api-access-jvcht\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.702240 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-scripts\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.705117 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.711825 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788096 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788167 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788244 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788296 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788334 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788360 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788384 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788454 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj949\" (UniqueName: \"kubernetes.io/projected/4bdf37de-cbcd-4971-a688-f3b91cf98b47-kube-api-access-hj949\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788495 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-logs\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788530 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhl87\" (UniqueName: \"kubernetes.io/projected/046ac75d-4cfe-40ed-aca5-a5875ea641d8-kube-api-access-jhl87\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788569 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788605 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788645 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788676 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788751 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.788777 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.789496 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.791361 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.792433 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.793777 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.794217 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.794330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.795658 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.807536 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.816498 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhl87\" (UniqueName: \"kubernetes.io/projected/046ac75d-4cfe-40ed-aca5-a5875ea641d8-kube-api-access-jhl87\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.863698 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891195 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891270 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj949\" (UniqueName: \"kubernetes.io/projected/4bdf37de-cbcd-4971-a688-f3b91cf98b47-kube-api-access-hj949\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891301 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-logs\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891335 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891384 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891430 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891453 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891496 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.891639 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.896233 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.897348 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-logs\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.900156 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.910571 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.915175 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.915419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.930318 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj949\" (UniqueName: \"kubernetes.io/projected/4bdf37de-cbcd-4971-a688-f3b91cf98b47-kube-api-access-hj949\") pod 
\"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.952409 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.955074 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 13:52:22 crc kubenswrapper[5200]: I1126 13:52:22.955394 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " pod="openstack/glance-default-external-api-0" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.026092 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29258dfe-0805-481f-ad5f-a133bd88517e" path="/var/lib/kubelet/pods/29258dfe-0805-481f-ad5f-a133bd88517e/volumes" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.027553 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90fc40a5-dec9-4784-80ce-04dfe978384d" path="/var/lib/kubelet/pods/90fc40a5-dec9-4784-80ce-04dfe978384d/volumes" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.030765 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baadd4af-f958-45eb-ae2f-eb5ecead58a7" path="/var/lib/kubelet/pods/baadd4af-f958-45eb-ae2f-eb5ecead58a7/volumes" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.254573 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.338769 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.507220 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.728885 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:52:23 crc kubenswrapper[5200]: W1126 13:52:23.736023 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod046ac75d_4cfe_40ed_aca5_a5875ea641d8.slice/crio-3d48004db4d106cd8c7135bcd815a52ade65c2d73e608a8d23d85826804fd9de WatchSource:0}: Error finding container 3d48004db4d106cd8c7135bcd815a52ade65c2d73e608a8d23d85826804fd9de: Status 404 returned error can't find the container with id 3d48004db4d106cd8c7135bcd815a52ade65c2d73e608a8d23d85826804fd9de Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.811798 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.912006 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5777f78f56-8db7l"] Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.916930 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-5777f78f56-8db7l" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-api" containerID="cri-o://2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a" gracePeriod=30 Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.917137 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-5777f78f56-8db7l" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-httpd" containerID="cri-o://cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7" gracePeriod=30 Nov 26 13:52:23 crc kubenswrapper[5200]: I1126 13:52:23.945949 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:52:23 crc kubenswrapper[5200]: W1126 13:52:23.957921 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bdf37de_cbcd_4971_a688_f3b91cf98b47.slice/crio-ee433c892a72ac2b116b848168af197a5ddf19a92a364f003343a68e3b2aab04 WatchSource:0}: Error finding container ee433c892a72ac2b116b848168af197a5ddf19a92a364f003343a68e3b2aab04: Status 404 returned error can't find the container with id ee433c892a72ac2b116b848168af197a5ddf19a92a364f003343a68e3b2aab04 Nov 26 13:52:24 crc kubenswrapper[5200]: I1126 13:52:24.488200 5200 generic.go:358] "Generic (PLEG): container finished" podID="0effc02a-2b10-417e-b686-635f9578dfb7" containerID="cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7" exitCode=0 Nov 26 13:52:24 crc kubenswrapper[5200]: I1126 13:52:24.488810 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5777f78f56-8db7l" event={"ID":"0effc02a-2b10-417e-b686-635f9578dfb7","Type":"ContainerDied","Data":"cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7"} Nov 26 13:52:24 crc kubenswrapper[5200]: I1126 13:52:24.495375 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"046ac75d-4cfe-40ed-aca5-a5875ea641d8","Type":"ContainerStarted","Data":"3d48004db4d106cd8c7135bcd815a52ade65c2d73e608a8d23d85826804fd9de"} Nov 26 13:52:24 crc kubenswrapper[5200]: I1126 13:52:24.499616 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerStarted","Data":"5dcc19dabc9f53d2d4c85021f0414aae32d2345ad7d93386628d6a76cf63a7e1"} Nov 26 13:52:24 crc kubenswrapper[5200]: I1126 13:52:24.517049 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4bdf37de-cbcd-4971-a688-f3b91cf98b47","Type":"ContainerStarted","Data":"ee433c892a72ac2b116b848168af197a5ddf19a92a364f003343a68e3b2aab04"} Nov 26 13:52:25 crc kubenswrapper[5200]: I1126 13:52:25.547543 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"046ac75d-4cfe-40ed-aca5-a5875ea641d8","Type":"ContainerStarted","Data":"7503a8964c681bf49d84540f8809922bb7d2d71d2f4e4f1b1afea676f29d0da7"} Nov 26 13:52:25 crc kubenswrapper[5200]: I1126 13:52:25.555094 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerStarted","Data":"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9"} Nov 26 13:52:25 crc kubenswrapper[5200]: I1126 13:52:25.559792 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4bdf37de-cbcd-4971-a688-f3b91cf98b47","Type":"ContainerStarted","Data":"f77ef15f63cb09956b5bfe2dd8b835072429e31cb25ce434a16fdc1b92dc9dea"} Nov 26 13:52:26 crc kubenswrapper[5200]: I1126 13:52:26.047493 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:26 crc kubenswrapper[5200]: I1126 13:52:26.573521 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"046ac75d-4cfe-40ed-aca5-a5875ea641d8","Type":"ContainerStarted","Data":"03b29c837a967963b6848307dc062dac95cb5e7b79937c757a26948129ebd5ec"} Nov 26 13:52:26 crc kubenswrapper[5200]: I1126 13:52:26.576674 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerStarted","Data":"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b"} Nov 26 13:52:26 crc kubenswrapper[5200]: I1126 13:52:26.582355 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4bdf37de-cbcd-4971-a688-f3b91cf98b47","Type":"ContainerStarted","Data":"f8133fd16248ab3367c5964464b825f0d076ca73089fbe5d382745f4baeb4dd1"} Nov 26 13:52:26 crc kubenswrapper[5200]: I1126 13:52:26.608776 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.608753733 podStartE2EDuration="4.608753733s" podCreationTimestamp="2025-11-26 13:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:26.597098925 +0000 UTC m=+1315.276300753" watchObservedRunningTime="2025-11-26 13:52:26.608753733 +0000 UTC m=+1315.287955551" Nov 26 13:52:26 crc kubenswrapper[5200]: I1126 13:52:26.638897 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=4.638861914 podStartE2EDuration="4.638861914s" podCreationTimestamp="2025-11-26 13:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:26.637724617 +0000 UTC m=+1315.316926435" watchObservedRunningTime="2025-11-26 13:52:26.638861914 +0000 UTC m=+1315.318063732" Nov 26 13:52:27 crc kubenswrapper[5200]: I1126 13:52:27.595124 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerStarted","Data":"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203"} Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.611295 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerStarted","Data":"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a"} Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.611973 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.611719 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="proxy-httpd" containerID="cri-o://d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" gracePeriod=30 Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.611650 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-central-agent" containerID="cri-o://ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" gracePeriod=30 Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.611660 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-notification-agent" containerID="cri-o://4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" gracePeriod=30 Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.611735 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="sg-core" containerID="cri-o://7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" gracePeriod=30 Nov 26 13:52:28 crc kubenswrapper[5200]: I1126 13:52:28.645954 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8177671800000001 podStartE2EDuration="6.645934015s" podCreationTimestamp="2025-11-26 13:52:22 +0000 UTC" firstStartedPulling="2025-11-26 13:52:23.375489003 +0000 UTC m=+1312.054690821" lastFinishedPulling="2025-11-26 13:52:28.203655838 +0000 UTC m=+1316.882857656" observedRunningTime="2025-11-26 13:52:28.638547598 +0000 UTC m=+1317.317749426" watchObservedRunningTime="2025-11-26 13:52:28.645934015 +0000 UTC m=+1317.325135833" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.526111 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.561065 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626402 5200 generic.go:358] "Generic (PLEG): container finished" podID="2bf616d2-17fd-4cff-985e-605babde3455" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" exitCode=0 Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626434 5200 generic.go:358] "Generic (PLEG): container finished" podID="2bf616d2-17fd-4cff-985e-605babde3455" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" exitCode=2 Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626442 5200 generic.go:358] "Generic (PLEG): container finished" podID="2bf616d2-17fd-4cff-985e-605babde3455" containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" exitCode=0 Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626448 5200 generic.go:358] "Generic (PLEG): container finished" podID="2bf616d2-17fd-4cff-985e-605babde3455" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" exitCode=0 Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626593 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerDied","Data":"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a"} Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626627 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerDied","Data":"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203"} Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626641 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerDied","Data":"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b"} Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626651 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerDied","Data":"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9"} Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626661 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bf616d2-17fd-4cff-985e-605babde3455","Type":"ContainerDied","Data":"5dcc19dabc9f53d2d4c85021f0414aae32d2345ad7d93386628d6a76cf63a7e1"} Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626693 5200 scope.go:117] "RemoveContainer" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.626887 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.670303 5200 scope.go:117] "RemoveContainer" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.700921 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-log-httpd\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.701090 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-sg-core-conf-yaml\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.701237 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvcht\" (UniqueName: \"kubernetes.io/projected/2bf616d2-17fd-4cff-985e-605babde3455-kube-api-access-jvcht\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.701268 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-combined-ca-bundle\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.701347 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-run-httpd\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.701407 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-scripts\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.701738 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-config-data\") pod \"2bf616d2-17fd-4cff-985e-605babde3455\" (UID: \"2bf616d2-17fd-4cff-985e-605babde3455\") " Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.702596 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.707943 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.718890 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-scripts" (OuterVolumeSpecName: "scripts") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.719091 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf616d2-17fd-4cff-985e-605babde3455-kube-api-access-jvcht" (OuterVolumeSpecName: "kube-api-access-jvcht") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "kube-api-access-jvcht". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.725155 5200 scope.go:117] "RemoveContainer" containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.757924 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.804641 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.805073 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.805095 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvcht\" (UniqueName: \"kubernetes.io/projected/2bf616d2-17fd-4cff-985e-605babde3455-kube-api-access-jvcht\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.805105 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bf616d2-17fd-4cff-985e-605babde3455-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.805115 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.863671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.911242 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.918118 5200 scope.go:117] "RemoveContainer" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.925200 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-config-data" (OuterVolumeSpecName: "config-data") pod "2bf616d2-17fd-4cff-985e-605babde3455" (UID: "2bf616d2-17fd-4cff-985e-605babde3455"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.953193 5200 scope.go:117] "RemoveContainer" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" Nov 26 13:52:29 crc kubenswrapper[5200]: E1126 13:52:29.956009 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": container with ID starting with d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a not found: ID does not exist" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.956067 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a"} err="failed to get container status \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": rpc error: code = NotFound desc = could not find container \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": container with ID starting with d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.956094 5200 scope.go:117] "RemoveContainer" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" Nov 26 13:52:29 crc kubenswrapper[5200]: E1126 13:52:29.956501 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": container with ID starting with 7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203 not found: ID does not exist" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.958513 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203"} err="failed to get container status \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": rpc error: code = NotFound desc = could not find container \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": container with ID starting with 7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203 not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.958542 5200 scope.go:117] "RemoveContainer" 
containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" Nov 26 13:52:29 crc kubenswrapper[5200]: E1126 13:52:29.958855 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": container with ID starting with 4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b not found: ID does not exist" containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.958884 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b"} err="failed to get container status \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": rpc error: code = NotFound desc = could not find container \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": container with ID starting with 4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.958899 5200 scope.go:117] "RemoveContainer" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" Nov 26 13:52:29 crc kubenswrapper[5200]: E1126 13:52:29.959142 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": container with ID starting with ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9 not found: ID does not exist" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.959163 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9"} err="failed to get container status \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": rpc error: code = NotFound desc = could not find container \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": container with ID starting with ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9 not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.959176 5200 scope.go:117] "RemoveContainer" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.959778 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a"} err="failed to get container status \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": rpc error: code = NotFound desc = could not find container \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": container with ID starting with d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.959798 5200 scope.go:117] "RemoveContainer" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.990379 5200 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203"} err="failed to get container status \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": rpc error: code = NotFound desc = could not find container \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": container with ID starting with 7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203 not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.990445 5200 scope.go:117] "RemoveContainer" containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.991499 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b"} err="failed to get container status \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": rpc error: code = NotFound desc = could not find container \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": container with ID starting with 4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.991568 5200 scope.go:117] "RemoveContainer" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.992270 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9"} err="failed to get container status \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": rpc error: code = NotFound desc = could not find container \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": container with ID starting with ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9 not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.992291 5200 scope.go:117] "RemoveContainer" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.992766 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a"} err="failed to get container status \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": rpc error: code = NotFound desc = could not find container \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": container with ID starting with d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.992814 5200 scope.go:117] "RemoveContainer" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.993502 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203"} err="failed to get container status \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": rpc error: code = NotFound desc = could not find container \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": container with ID starting with 7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203 not found: ID does not exist" Nov 
26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.993573 5200 scope.go:117] "RemoveContainer" containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.998314 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b"} err="failed to get container status \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": rpc error: code = NotFound desc = could not find container \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": container with ID starting with 4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.998486 5200 scope.go:117] "RemoveContainer" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.999327 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9"} err="failed to get container status \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": rpc error: code = NotFound desc = could not find container \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": container with ID starting with ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9 not found: ID does not exist" Nov 26 13:52:29 crc kubenswrapper[5200]: I1126 13:52:29.999356 5200 scope.go:117] "RemoveContainer" containerID="d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.002467 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a"} err="failed to get container status \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": rpc error: code = NotFound desc = could not find container \"d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a\": container with ID starting with d27c32492de0c1130c54823cb36e2a779fe813d9edbddb9d616e11db70bc1e3a not found: ID does not exist" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.002863 5200 scope.go:117] "RemoveContainer" containerID="7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.003668 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203"} err="failed to get container status \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": rpc error: code = NotFound desc = could not find container \"7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203\": container with ID starting with 7e03a47cb353efbe1b9267895e4b9ec8961737dc8c27572b7d422feff6ad6203 not found: ID does not exist" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.012570 5200 scope.go:117] "RemoveContainer" containerID="4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.024632 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf616d2-17fd-4cff-985e-605babde3455-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:30 crc 
kubenswrapper[5200]: I1126 13:52:30.033620 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b"} err="failed to get container status \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": rpc error: code = NotFound desc = could not find container \"4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b\": container with ID starting with 4ce90c06b98ed288e9c923b3ef1d49bcf798893a1e59e4936fd084bdd3260c3b not found: ID does not exist" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.033830 5200 scope.go:117] "RemoveContainer" containerID="ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.034522 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9"} err="failed to get container status \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": rpc error: code = NotFound desc = could not find container \"ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9\": container with ID starting with ed37fb75dc11d6c65049e7cfb030277dc528daf2bd27fc252e3fec2c330374d9 not found: ID does not exist" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.052476 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.072755 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.084870 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086867 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-central-agent" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086890 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-central-agent" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086913 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="sg-core" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086922 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="sg-core" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086956 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="proxy-httpd" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086963 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="proxy-httpd" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086989 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-notification-agent" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.086996 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-notification-agent" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.087186 5200 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="proxy-httpd" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.087204 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-central-agent" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.087216 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="ceilometer-notification-agent" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.087225 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bf616d2-17fd-4cff-985e-605babde3455" containerName="sg-core" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.096015 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.096800 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.100264 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.100583 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127044 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-run-httpd\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127082 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-scripts\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127142 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twx9z\" (UniqueName: \"kubernetes.io/projected/05b1b251-bdcc-4116-8606-ce6d963ca918-kube-api-access-twx9z\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127164 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127205 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-log-httpd\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127264 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.127289 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-config-data\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.130440 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229649 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-log-httpd\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229751 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229782 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-config-data\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229846 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-run-httpd\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-scripts\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229920 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twx9z\" (UniqueName: \"kubernetes.io/projected/05b1b251-bdcc-4116-8606-ce6d963ca918-kube-api-access-twx9z\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.229941 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.231269 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-log-httpd\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.232583 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-run-httpd\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.234843 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.235530 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-config-data\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.236353 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-scripts\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.236910 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.253500 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twx9z\" (UniqueName: \"kubernetes.io/projected/05b1b251-bdcc-4116-8606-ce6d963ca918-kube-api-access-twx9z\") pod \"ceilometer-0\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.331432 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-combined-ca-bundle\") pod \"0effc02a-2b10-417e-b686-635f9578dfb7\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.331847 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-config\") pod \"0effc02a-2b10-417e-b686-635f9578dfb7\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.332552 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kcv5\" (UniqueName: \"kubernetes.io/projected/0effc02a-2b10-417e-b686-635f9578dfb7-kube-api-access-2kcv5\") pod \"0effc02a-2b10-417e-b686-635f9578dfb7\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.332794 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-ovndb-tls-certs\") pod \"0effc02a-2b10-417e-b686-635f9578dfb7\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.333401 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-httpd-config\") pod \"0effc02a-2b10-417e-b686-635f9578dfb7\" (UID: \"0effc02a-2b10-417e-b686-635f9578dfb7\") " Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.337985 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0effc02a-2b10-417e-b686-635f9578dfb7-kube-api-access-2kcv5" (OuterVolumeSpecName: "kube-api-access-2kcv5") pod "0effc02a-2b10-417e-b686-635f9578dfb7" (UID: "0effc02a-2b10-417e-b686-635f9578dfb7"). InnerVolumeSpecName "kube-api-access-2kcv5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.357666 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0effc02a-2b10-417e-b686-635f9578dfb7" (UID: "0effc02a-2b10-417e-b686-635f9578dfb7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.404757 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-config" (OuterVolumeSpecName: "config") pod "0effc02a-2b10-417e-b686-635f9578dfb7" (UID: "0effc02a-2b10-417e-b686-635f9578dfb7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.418388 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.435983 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.436028 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.436042 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kcv5\" (UniqueName: \"kubernetes.io/projected/0effc02a-2b10-417e-b686-635f9578dfb7-kube-api-access-2kcv5\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.439413 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0effc02a-2b10-417e-b686-635f9578dfb7" (UID: "0effc02a-2b10-417e-b686-635f9578dfb7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.442323 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0effc02a-2b10-417e-b686-635f9578dfb7" (UID: "0effc02a-2b10-417e-b686-635f9578dfb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.538885 5200 reconciler_common.go:299] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.539222 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effc02a-2b10-417e-b686-635f9578dfb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.642413 5200 generic.go:358] "Generic (PLEG): container finished" podID="0effc02a-2b10-417e-b686-635f9578dfb7" containerID="2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a" exitCode=0 Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.642547 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5777f78f56-8db7l" event={"ID":"0effc02a-2b10-417e-b686-635f9578dfb7","Type":"ContainerDied","Data":"2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a"} Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.642728 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5777f78f56-8db7l" event={"ID":"0effc02a-2b10-417e-b686-635f9578dfb7","Type":"ContainerDied","Data":"0b9815d881e963dacf0a891cbc1d8788bd297d803eafecb9ab46f1ed9dc2c3ac"} Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.642760 5200 scope.go:117] "RemoveContainer" containerID="cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.643033 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5777f78f56-8db7l" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.690482 5200 scope.go:117] "RemoveContainer" containerID="2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.722756 5200 scope.go:117] "RemoveContainer" containerID="cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7" Nov 26 13:52:30 crc kubenswrapper[5200]: E1126 13:52:30.726419 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7\": container with ID starting with cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7 not found: ID does not exist" containerID="cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.726480 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7"} err="failed to get container status \"cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7\": rpc error: code = NotFound desc = could not find container \"cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7\": container with ID starting with cb3ba26a87dc61202b40654f04031a3a09664b89683223b37c903b601ebf3ed7 not found: ID does not exist" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.726509 5200 scope.go:117] "RemoveContainer" containerID="2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a" Nov 26 13:52:30 crc kubenswrapper[5200]: E1126 13:52:30.727543 5200 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a\": container with ID starting with 2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a not found: ID does not exist" containerID="2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.727568 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a"} err="failed to get container status \"2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a\": rpc error: code = NotFound desc = could not find container \"2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a\": container with ID starting with 2590a875d6e443ed343e8579ce9425edf35fc5d3a2f1b36d09b42ddc2be8f00a not found: ID does not exist" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.728716 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5777f78f56-8db7l"] Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.742062 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5777f78f56-8db7l"] Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.887167 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:30 crc kubenswrapper[5200]: W1126 13:52:30.888578 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05b1b251_bdcc_4116_8606_ce6d963ca918.slice/crio-7d2fc39dca2de13397a14ca09726d07f9982ad96fa78e3d09162ddf73c7c8dd1 WatchSource:0}: Error finding container 7d2fc39dca2de13397a14ca09726d07f9982ad96fa78e3d09162ddf73c7c8dd1: Status 404 returned error can't find the container with id 7d2fc39dca2de13397a14ca09726d07f9982ad96fa78e3d09162ddf73c7c8dd1 Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.984253 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" path="/var/lib/kubelet/pods/0effc02a-2b10-417e-b686-635f9578dfb7/volumes" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.985428 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf616d2-17fd-4cff-985e-605babde3455" path="/var/lib/kubelet/pods/2bf616d2-17fd-4cff-985e-605babde3455/volumes" Nov 26 13:52:30 crc kubenswrapper[5200]: I1126 13:52:30.987015 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:31 crc kubenswrapper[5200]: I1126 13:52:31.669082 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerStarted","Data":"7d2fc39dca2de13397a14ca09726d07f9982ad96fa78e3d09162ddf73c7c8dd1"} Nov 26 13:52:32 crc kubenswrapper[5200]: I1126 13:52:32.688139 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerStarted","Data":"14472cb1362fed56614abd2f3cfbc49c72f1d300d9597558e3d6810ed9a5890b"} Nov 26 13:52:32 crc kubenswrapper[5200]: I1126 13:52:32.688974 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerStarted","Data":"ba732d56e77d5e6b2577e3ec7571200684e3ec61d202b05566fabb682a9c0938"} Nov 26 13:52:32 crc kubenswrapper[5200]: I1126 13:52:32.955590 5200 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:32 crc kubenswrapper[5200]: I1126 13:52:32.955644 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.013562 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.018165 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.256176 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.258101 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.313098 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.354032 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.715597 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerStarted","Data":"7e02c095fa5f94d68944d6e6025e4176a5f901bb556cf67a3c9b8af2661931ad"} Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.717558 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.717604 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.717618 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:33 crc kubenswrapper[5200]: I1126 13:52:33.717627 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.775511 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerStarted","Data":"aee7031d190bd100dde3fc076d32450de4117619575c422244a934bfe2ab3ae5"} Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.776384 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-central-agent" containerID="cri-o://ba732d56e77d5e6b2577e3ec7571200684e3ec61d202b05566fabb682a9c0938" gracePeriod=30 Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.776777 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.776802 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="proxy-httpd" 
containerID="cri-o://aee7031d190bd100dde3fc076d32450de4117619575c422244a934bfe2ab3ae5" gracePeriod=30 Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.776882 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="sg-core" containerID="cri-o://7e02c095fa5f94d68944d6e6025e4176a5f901bb556cf67a3c9b8af2661931ad" gracePeriod=30 Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.776939 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-notification-agent" containerID="cri-o://14472cb1362fed56614abd2f3cfbc49c72f1d300d9597558e3d6810ed9a5890b" gracePeriod=30 Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.805979 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.104614281 podStartE2EDuration="6.805959136s" podCreationTimestamp="2025-11-26 13:52:29 +0000 UTC" firstStartedPulling="2025-11-26 13:52:30.890158233 +0000 UTC m=+1319.569360051" lastFinishedPulling="2025-11-26 13:52:34.591503098 +0000 UTC m=+1323.270704906" observedRunningTime="2025-11-26 13:52:35.805490665 +0000 UTC m=+1324.484692483" watchObservedRunningTime="2025-11-26 13:52:35.805959136 +0000 UTC m=+1324.485160974" Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.812790 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.812912 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 13:52:35 crc kubenswrapper[5200]: I1126 13:52:35.815802 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.060046 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.060431 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.799586 5200 generic.go:358] "Generic (PLEG): container finished" podID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerID="aee7031d190bd100dde3fc076d32450de4117619575c422244a934bfe2ab3ae5" exitCode=0 Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.799636 5200 generic.go:358] "Generic (PLEG): container finished" podID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerID="7e02c095fa5f94d68944d6e6025e4176a5f901bb556cf67a3c9b8af2661931ad" exitCode=2 Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.799643 5200 generic.go:358] "Generic (PLEG): container finished" podID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerID="14472cb1362fed56614abd2f3cfbc49c72f1d300d9597558e3d6810ed9a5890b" exitCode=0 Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.800285 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerDied","Data":"aee7031d190bd100dde3fc076d32450de4117619575c422244a934bfe2ab3ae5"} Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.800359 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerDied","Data":"7e02c095fa5f94d68944d6e6025e4176a5f901bb556cf67a3c9b8af2661931ad"} Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.800373 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerDied","Data":"14472cb1362fed56614abd2f3cfbc49c72f1d300d9597558e3d6810ed9a5890b"} Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.843997 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-c2h6h"] Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.848701 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-httpd" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.848772 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-httpd" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.848815 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-api" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.848824 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-api" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.849425 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-httpd" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.849471 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0effc02a-2b10-417e-b686-635f9578dfb7" containerName="neutron-api" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.866679 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.876948 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c2h6h"] Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.968780 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jjw96"] Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.983118 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmf2\" (UniqueName: \"kubernetes.io/projected/1efdd422-f937-4662-a6f4-3b0e6b0232dd-kube-api-access-sqmf2\") pod \"nova-api-db-create-c2h6h\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.983281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efdd422-f937-4662-a6f4-3b0e6b0232dd-operator-scripts\") pod \"nova-api-db-create-c2h6h\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:36 crc kubenswrapper[5200]: I1126 13:52:36.983598 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.025171 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5c5d-account-create-update-r2hfq"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.032727 5200 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9a16cfaf-3944-4dce-bb5d-9e6bce2ed392"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9a16cfaf-3944-4dce-bb5d-9e6bce2ed392] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9a16cfaf_3944_4dce_bb5d_9e6bce2ed392.slice" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.049926 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5c5d-account-create-update-r2hfq"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.049980 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jjw96"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.050095 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.053272 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-db-secret\"" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.073579 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zkjwj"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.080120 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.085461 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efdd422-f937-4662-a6f4-3b0e6b0232dd-operator-scripts\") pod \"nova-api-db-create-c2h6h\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.085521 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db49787b-26a2-4daf-a2fc-040890054555-operator-scripts\") pod \"nova-cell0-db-create-jjw96\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.085642 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmf2\" (UniqueName: \"kubernetes.io/projected/1efdd422-f937-4662-a6f4-3b0e6b0232dd-kube-api-access-sqmf2\") pod \"nova-api-db-create-c2h6h\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.085665 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr728\" (UniqueName: \"kubernetes.io/projected/db49787b-26a2-4daf-a2fc-040890054555-kube-api-access-gr728\") pod \"nova-cell0-db-create-jjw96\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.087799 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1efdd422-f937-4662-a6f4-3b0e6b0232dd-operator-scripts\") pod \"nova-api-db-create-c2h6h\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.103752 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zkjwj"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.152294 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqmf2\" (UniqueName: \"kubernetes.io/projected/1efdd422-f937-4662-a6f4-3b0e6b0232dd-kube-api-access-sqmf2\") pod \"nova-api-db-create-c2h6h\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.166204 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e30c-account-create-update-shf9k"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.179929 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e30c-account-create-update-shf9k"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.180076 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.194309 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-operator-scripts\") pod \"nova-cell1-db-create-zkjwj\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.194389 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr728\" (UniqueName: \"kubernetes.io/projected/db49787b-26a2-4daf-a2fc-040890054555-kube-api-access-gr728\") pod \"nova-cell0-db-create-jjw96\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.194427 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9254e1be-19ef-4af5-b8f8-ecea111851d9-operator-scripts\") pod \"nova-api-5c5d-account-create-update-r2hfq\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.194875 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db49787b-26a2-4daf-a2fc-040890054555-operator-scripts\") pod \"nova-cell0-db-create-jjw96\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.194949 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8frq\" (UniqueName: \"kubernetes.io/projected/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-kube-api-access-k8frq\") pod \"nova-cell1-db-create-zkjwj\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.195045 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jzn\" 
(UniqueName: \"kubernetes.io/projected/9254e1be-19ef-4af5-b8f8-ecea111851d9-kube-api-access-m6jzn\") pod \"nova-api-5c5d-account-create-update-r2hfq\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.197378 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-db-secret\"" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.200216 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db49787b-26a2-4daf-a2fc-040890054555-operator-scripts\") pod \"nova-cell0-db-create-jjw96\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.212130 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.218651 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr728\" (UniqueName: \"kubernetes.io/projected/db49787b-26a2-4daf-a2fc-040890054555-kube-api-access-gr728\") pod \"nova-cell0-db-create-jjw96\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.297498 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jzn\" (UniqueName: \"kubernetes.io/projected/9254e1be-19ef-4af5-b8f8-ecea111851d9-kube-api-access-m6jzn\") pod \"nova-api-5c5d-account-create-update-r2hfq\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.297929 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-operator-scripts\") pod \"nova-cell1-db-create-zkjwj\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.297972 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9254e1be-19ef-4af5-b8f8-ecea111851d9-operator-scripts\") pod \"nova-api-5c5d-account-create-update-r2hfq\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.298084 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4f6\" (UniqueName: \"kubernetes.io/projected/7173d1b4-2e90-4382-865d-4736322f38fa-kube-api-access-8d4f6\") pod \"nova-cell0-e30c-account-create-update-shf9k\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.298130 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7173d1b4-2e90-4382-865d-4736322f38fa-operator-scripts\") pod \"nova-cell0-e30c-account-create-update-shf9k\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc 
kubenswrapper[5200]: I1126 13:52:37.298157 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8frq\" (UniqueName: \"kubernetes.io/projected/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-kube-api-access-k8frq\") pod \"nova-cell1-db-create-zkjwj\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.299087 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9254e1be-19ef-4af5-b8f8-ecea111851d9-operator-scripts\") pod \"nova-api-5c5d-account-create-update-r2hfq\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.299909 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-operator-scripts\") pod \"nova-cell1-db-create-zkjwj\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.325079 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.325414 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jzn\" (UniqueName: \"kubernetes.io/projected/9254e1be-19ef-4af5-b8f8-ecea111851d9-kube-api-access-m6jzn\") pod \"nova-api-5c5d-account-create-update-r2hfq\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.325421 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8frq\" (UniqueName: \"kubernetes.io/projected/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-kube-api-access-k8frq\") pod \"nova-cell1-db-create-zkjwj\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.378856 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ed9a-account-create-update-6qjv9"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.388931 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.391611 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.397251 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-db-secret\"" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.402895 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4f6\" (UniqueName: \"kubernetes.io/projected/7173d1b4-2e90-4382-865d-4736322f38fa-kube-api-access-8d4f6\") pod \"nova-cell0-e30c-account-create-update-shf9k\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.403065 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7173d1b4-2e90-4382-865d-4736322f38fa-operator-scripts\") pod \"nova-cell0-e30c-account-create-update-shf9k\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.403902 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7173d1b4-2e90-4382-865d-4736322f38fa-operator-scripts\") pod \"nova-cell0-e30c-account-create-update-shf9k\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.405372 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.435934 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ed9a-account-create-update-6qjv9"] Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.446913 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4f6\" (UniqueName: \"kubernetes.io/projected/7173d1b4-2e90-4382-865d-4736322f38fa-kube-api-access-8d4f6\") pod \"nova-cell0-e30c-account-create-update-shf9k\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.513842 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86qpk\" (UniqueName: \"kubernetes.io/projected/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-kube-api-access-86qpk\") pod \"nova-cell1-ed9a-account-create-update-6qjv9\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.514351 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-operator-scripts\") pod \"nova-cell1-ed9a-account-create-update-6qjv9\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.518151 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.619746 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-operator-scripts\") pod \"nova-cell1-ed9a-account-create-update-6qjv9\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.620134 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86qpk\" (UniqueName: \"kubernetes.io/projected/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-kube-api-access-86qpk\") pod \"nova-cell1-ed9a-account-create-update-6qjv9\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.621568 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-operator-scripts\") pod \"nova-cell1-ed9a-account-create-update-6qjv9\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.650179 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86qpk\" (UniqueName: \"kubernetes.io/projected/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-kube-api-access-86qpk\") pod \"nova-cell1-ed9a-account-create-update-6qjv9\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.753803 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:37 crc kubenswrapper[5200]: I1126 13:52:37.835355 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c2h6h"] Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.070695 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jjw96"] Nov 26 13:52:38 crc kubenswrapper[5200]: W1126 13:52:38.075599 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb49787b_26a2_4daf_a2fc_040890054555.slice/crio-302cf3bfba759080ead5c5932999ee24b54469d7074964d3130bbe560505bbba WatchSource:0}: Error finding container 302cf3bfba759080ead5c5932999ee24b54469d7074964d3130bbe560505bbba: Status 404 returned error can't find the container with id 302cf3bfba759080ead5c5932999ee24b54469d7074964d3130bbe560505bbba Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.164081 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5c5d-account-create-update-r2hfq"] Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.193480 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zkjwj"] Nov 26 13:52:38 crc kubenswrapper[5200]: W1126 13:52:38.365859 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7173d1b4_2e90_4382_865d_4736322f38fa.slice/crio-54a52e213d558a1863d828e2c486e3a78fbcec0fbd70104e09c228c5e50ad83f WatchSource:0}: Error finding container 54a52e213d558a1863d828e2c486e3a78fbcec0fbd70104e09c228c5e50ad83f: Status 404 returned error can't find the container with id 54a52e213d558a1863d828e2c486e3a78fbcec0fbd70104e09c228c5e50ad83f Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.368335 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e30c-account-create-update-shf9k"] Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.566670 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ed9a-account-create-update-6qjv9"] Nov 26 13:52:38 crc kubenswrapper[5200]: W1126 13:52:38.570640 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1af3d0a6_007c_4012_9f5b_2efe1e0b5a8d.slice/crio-7074b774e56c628dea1a437f1eacd8617fc3693e154e1e1468769b35944142b7 WatchSource:0}: Error finding container 7074b774e56c628dea1a437f1eacd8617fc3693e154e1e1468769b35944142b7: Status 404 returned error can't find the container with id 7074b774e56c628dea1a437f1eacd8617fc3693e154e1e1468769b35944142b7 Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.847945 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zkjwj" event={"ID":"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5","Type":"ContainerStarted","Data":"45b0a0d376caba5a2413c25066270cd6614242985b7d52550ad69cc6170e8cd6"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.847999 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zkjwj" event={"ID":"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5","Type":"ContainerStarted","Data":"29ffc6bbc85ed3fa68b0983329d5a5161b78c536457dc6b54a8edef0d55fc4c4"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.859243 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jjw96" 
event={"ID":"db49787b-26a2-4daf-a2fc-040890054555","Type":"ContainerStarted","Data":"2460558ab936aba0248046a3b63a15a776f86be31eb3e11aa26efff50eeff85c"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.859298 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jjw96" event={"ID":"db49787b-26a2-4daf-a2fc-040890054555","Type":"ContainerStarted","Data":"302cf3bfba759080ead5c5932999ee24b54469d7074964d3130bbe560505bbba"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.864952 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" event={"ID":"9254e1be-19ef-4af5-b8f8-ecea111851d9","Type":"ContainerStarted","Data":"fdb72c20a0584eddffd66dbae804db6f68e5c2f1e04b39b071c742e0af14da1f"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.865422 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" event={"ID":"9254e1be-19ef-4af5-b8f8-ecea111851d9","Type":"ContainerStarted","Data":"d57ac7fc0a0c90fb25bfcd8f6d5cdbd7fca6633de71cd0e32b725b3449570898"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.873001 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-zkjwj" podStartSLOduration=1.872987859 podStartE2EDuration="1.872987859s" podCreationTimestamp="2025-11-26 13:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:38.869615588 +0000 UTC m=+1327.548817406" watchObservedRunningTime="2025-11-26 13:52:38.872987859 +0000 UTC m=+1327.552189677" Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.873816 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" event={"ID":"7173d1b4-2e90-4382-865d-4736322f38fa","Type":"ContainerStarted","Data":"c9318e677b85daa3ac00ae27709df9fa64f05c09a028ed186bea46eb31efe3bf"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.873881 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" event={"ID":"7173d1b4-2e90-4382-865d-4736322f38fa","Type":"ContainerStarted","Data":"54a52e213d558a1863d828e2c486e3a78fbcec0fbd70104e09c228c5e50ad83f"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.877318 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c2h6h" event={"ID":"1efdd422-f937-4662-a6f4-3b0e6b0232dd","Type":"ContainerStarted","Data":"4aa8b6c3334d256efe21f55e401990fa88f4f0e637a7442e68b7624c21e7f69a"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.877366 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c2h6h" event={"ID":"1efdd422-f937-4662-a6f4-3b0e6b0232dd","Type":"ContainerStarted","Data":"fec553621f42ca7aa7659ff681600e02bffe2130b8935d161c94b41548c2bb5a"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.889587 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" event={"ID":"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d","Type":"ContainerStarted","Data":"f2ee82a44b7d1de45d8bea7c8864c1381ebc8a9c06ecf8f3807c22a0dda52190"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.889645 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" 
event={"ID":"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d","Type":"ContainerStarted","Data":"7074b774e56c628dea1a437f1eacd8617fc3693e154e1e1468769b35944142b7"} Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.890868 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-jjw96" podStartSLOduration=2.890847156 podStartE2EDuration="2.890847156s" podCreationTimestamp="2025-11-26 13:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:38.889651417 +0000 UTC m=+1327.568853235" watchObservedRunningTime="2025-11-26 13:52:38.890847156 +0000 UTC m=+1327.570048974" Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.922285 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" podStartSLOduration=2.922257548 podStartE2EDuration="2.922257548s" podCreationTimestamp="2025-11-26 13:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:38.914589054 +0000 UTC m=+1327.593790872" watchObservedRunningTime="2025-11-26 13:52:38.922257548 +0000 UTC m=+1327.601459366" Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.942351 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" podStartSLOduration=1.942321058 podStartE2EDuration="1.942321058s" podCreationTimestamp="2025-11-26 13:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:38.934563032 +0000 UTC m=+1327.613764860" watchObservedRunningTime="2025-11-26 13:52:38.942321058 +0000 UTC m=+1327.621522896" Nov 26 13:52:38 crc kubenswrapper[5200]: I1126 13:52:38.953988 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" podStartSLOduration=1.9539633969999999 podStartE2EDuration="1.953963397s" podCreationTimestamp="2025-11-26 13:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:52:38.952661226 +0000 UTC m=+1327.631863044" watchObservedRunningTime="2025-11-26 13:52:38.953963397 +0000 UTC m=+1327.633165215" Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.900608 5200 generic.go:358] "Generic (PLEG): container finished" podID="8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" containerID="45b0a0d376caba5a2413c25066270cd6614242985b7d52550ad69cc6170e8cd6" exitCode=0 Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.900725 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zkjwj" event={"ID":"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5","Type":"ContainerDied","Data":"45b0a0d376caba5a2413c25066270cd6614242985b7d52550ad69cc6170e8cd6"} Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.906043 5200 generic.go:358] "Generic (PLEG): container finished" podID="db49787b-26a2-4daf-a2fc-040890054555" containerID="2460558ab936aba0248046a3b63a15a776f86be31eb3e11aa26efff50eeff85c" exitCode=0 Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.906225 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jjw96" 
event={"ID":"db49787b-26a2-4daf-a2fc-040890054555","Type":"ContainerDied","Data":"2460558ab936aba0248046a3b63a15a776f86be31eb3e11aa26efff50eeff85c"} Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.908535 5200 generic.go:358] "Generic (PLEG): container finished" podID="9254e1be-19ef-4af5-b8f8-ecea111851d9" containerID="fdb72c20a0584eddffd66dbae804db6f68e5c2f1e04b39b071c742e0af14da1f" exitCode=0 Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.908630 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" event={"ID":"9254e1be-19ef-4af5-b8f8-ecea111851d9","Type":"ContainerDied","Data":"fdb72c20a0584eddffd66dbae804db6f68e5c2f1e04b39b071c742e0af14da1f"} Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.910708 5200 generic.go:358] "Generic (PLEG): container finished" podID="7173d1b4-2e90-4382-865d-4736322f38fa" containerID="c9318e677b85daa3ac00ae27709df9fa64f05c09a028ed186bea46eb31efe3bf" exitCode=0 Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.910878 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" event={"ID":"7173d1b4-2e90-4382-865d-4736322f38fa","Type":"ContainerDied","Data":"c9318e677b85daa3ac00ae27709df9fa64f05c09a028ed186bea46eb31efe3bf"} Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.913047 5200 generic.go:358] "Generic (PLEG): container finished" podID="1efdd422-f937-4662-a6f4-3b0e6b0232dd" containerID="4aa8b6c3334d256efe21f55e401990fa88f4f0e637a7442e68b7624c21e7f69a" exitCode=0 Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.913193 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c2h6h" event={"ID":"1efdd422-f937-4662-a6f4-3b0e6b0232dd","Type":"ContainerDied","Data":"4aa8b6c3334d256efe21f55e401990fa88f4f0e637a7442e68b7624c21e7f69a"} Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.924355 5200 generic.go:358] "Generic (PLEG): container finished" podID="1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" containerID="f2ee82a44b7d1de45d8bea7c8864c1381ebc8a9c06ecf8f3807c22a0dda52190" exitCode=0 Nov 26 13:52:39 crc kubenswrapper[5200]: I1126 13:52:39.924535 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" event={"ID":"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d","Type":"ContainerDied","Data":"f2ee82a44b7d1de45d8bea7c8864c1381ebc8a9c06ecf8f3807c22a0dda52190"} Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.362899 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.425322 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqmf2\" (UniqueName: \"kubernetes.io/projected/1efdd422-f937-4662-a6f4-3b0e6b0232dd-kube-api-access-sqmf2\") pod \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.425412 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efdd422-f937-4662-a6f4-3b0e6b0232dd-operator-scripts\") pod \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\" (UID: \"1efdd422-f937-4662-a6f4-3b0e6b0232dd\") " Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.426381 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1efdd422-f937-4662-a6f4-3b0e6b0232dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1efdd422-f937-4662-a6f4-3b0e6b0232dd" (UID: "1efdd422-f937-4662-a6f4-3b0e6b0232dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.431789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efdd422-f937-4662-a6f4-3b0e6b0232dd-kube-api-access-sqmf2" (OuterVolumeSpecName: "kube-api-access-sqmf2") pod "1efdd422-f937-4662-a6f4-3b0e6b0232dd" (UID: "1efdd422-f937-4662-a6f4-3b0e6b0232dd"). InnerVolumeSpecName "kube-api-access-sqmf2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.527866 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqmf2\" (UniqueName: \"kubernetes.io/projected/1efdd422-f937-4662-a6f4-3b0e6b0232dd-kube-api-access-sqmf2\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.527900 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efdd422-f937-4662-a6f4-3b0e6b0232dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.937069 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c2h6h" Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.937214 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c2h6h" event={"ID":"1efdd422-f937-4662-a6f4-3b0e6b0232dd","Type":"ContainerDied","Data":"fec553621f42ca7aa7659ff681600e02bffe2130b8935d161c94b41548c2bb5a"} Nov 26 13:52:40 crc kubenswrapper[5200]: I1126 13:52:40.937807 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fec553621f42ca7aa7659ff681600e02bffe2130b8935d161c94b41548c2bb5a" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.346105 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.449809 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d4f6\" (UniqueName: \"kubernetes.io/projected/7173d1b4-2e90-4382-865d-4736322f38fa-kube-api-access-8d4f6\") pod \"7173d1b4-2e90-4382-865d-4736322f38fa\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.450097 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7173d1b4-2e90-4382-865d-4736322f38fa-operator-scripts\") pod \"7173d1b4-2e90-4382-865d-4736322f38fa\" (UID: \"7173d1b4-2e90-4382-865d-4736322f38fa\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.451329 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7173d1b4-2e90-4382-865d-4736322f38fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7173d1b4-2e90-4382-865d-4736322f38fa" (UID: "7173d1b4-2e90-4382-865d-4736322f38fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.488622 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7173d1b4-2e90-4382-865d-4736322f38fa-kube-api-access-8d4f6" (OuterVolumeSpecName: "kube-api-access-8d4f6") pod "7173d1b4-2e90-4382-865d-4736322f38fa" (UID: "7173d1b4-2e90-4382-865d-4736322f38fa"). InnerVolumeSpecName "kube-api-access-8d4f6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.553819 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8d4f6\" (UniqueName: \"kubernetes.io/projected/7173d1b4-2e90-4382-865d-4736322f38fa-kube-api-access-8d4f6\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.553851 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7173d1b4-2e90-4382-865d-4736322f38fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.631072 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.639223 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.658116 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.671399 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763342 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db49787b-26a2-4daf-a2fc-040890054555-operator-scripts\") pod \"db49787b-26a2-4daf-a2fc-040890054555\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763432 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr728\" (UniqueName: \"kubernetes.io/projected/db49787b-26a2-4daf-a2fc-040890054555-kube-api-access-gr728\") pod \"db49787b-26a2-4daf-a2fc-040890054555\" (UID: \"db49787b-26a2-4daf-a2fc-040890054555\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763665 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8frq\" (UniqueName: \"kubernetes.io/projected/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-kube-api-access-k8frq\") pod \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763764 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6jzn\" (UniqueName: \"kubernetes.io/projected/9254e1be-19ef-4af5-b8f8-ecea111851d9-kube-api-access-m6jzn\") pod \"9254e1be-19ef-4af5-b8f8-ecea111851d9\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763817 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86qpk\" (UniqueName: \"kubernetes.io/projected/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-kube-api-access-86qpk\") pod \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763966 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-operator-scripts\") pod \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\" (UID: \"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.763981 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db49787b-26a2-4daf-a2fc-040890054555-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db49787b-26a2-4daf-a2fc-040890054555" (UID: "db49787b-26a2-4daf-a2fc-040890054555"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.764142 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9254e1be-19ef-4af5-b8f8-ecea111851d9-operator-scripts\") pod \"9254e1be-19ef-4af5-b8f8-ecea111851d9\" (UID: \"9254e1be-19ef-4af5-b8f8-ecea111851d9\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.764255 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-operator-scripts\") pod \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\" (UID: \"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5\") " Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.764726 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" (UID: "1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.765798 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" (UID: "8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.766729 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db49787b-26a2-4daf-a2fc-040890054555-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.766760 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.766771 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.768848 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9254e1be-19ef-4af5-b8f8-ecea111851d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9254e1be-19ef-4af5-b8f8-ecea111851d9" (UID: "9254e1be-19ef-4af5-b8f8-ecea111851d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.769358 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db49787b-26a2-4daf-a2fc-040890054555-kube-api-access-gr728" (OuterVolumeSpecName: "kube-api-access-gr728") pod "db49787b-26a2-4daf-a2fc-040890054555" (UID: "db49787b-26a2-4daf-a2fc-040890054555"). InnerVolumeSpecName "kube-api-access-gr728". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.778266 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-kube-api-access-86qpk" (OuterVolumeSpecName: "kube-api-access-86qpk") pod "1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" (UID: "1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d"). InnerVolumeSpecName "kube-api-access-86qpk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.778776 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-kube-api-access-k8frq" (OuterVolumeSpecName: "kube-api-access-k8frq") pod "8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" (UID: "8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5"). InnerVolumeSpecName "kube-api-access-k8frq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.780845 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9254e1be-19ef-4af5-b8f8-ecea111851d9-kube-api-access-m6jzn" (OuterVolumeSpecName: "kube-api-access-m6jzn") pod "9254e1be-19ef-4af5-b8f8-ecea111851d9" (UID: "9254e1be-19ef-4af5-b8f8-ecea111851d9"). InnerVolumeSpecName "kube-api-access-m6jzn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.868339 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-86qpk\" (UniqueName: \"kubernetes.io/projected/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d-kube-api-access-86qpk\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.868377 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9254e1be-19ef-4af5-b8f8-ecea111851d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.868389 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gr728\" (UniqueName: \"kubernetes.io/projected/db49787b-26a2-4daf-a2fc-040890054555-kube-api-access-gr728\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.868399 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8frq\" (UniqueName: \"kubernetes.io/projected/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5-kube-api-access-k8frq\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.868408 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6jzn\" (UniqueName: \"kubernetes.io/projected/9254e1be-19ef-4af5-b8f8-ecea111851d9-kube-api-access-m6jzn\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.948813 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" event={"ID":"1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d","Type":"ContainerDied","Data":"7074b774e56c628dea1a437f1eacd8617fc3693e154e1e1468769b35944142b7"} Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.948863 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7074b774e56c628dea1a437f1eacd8617fc3693e154e1e1468769b35944142b7" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.948975 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ed9a-account-create-update-6qjv9" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.951388 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zkjwj" event={"ID":"8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5","Type":"ContainerDied","Data":"29ffc6bbc85ed3fa68b0983329d5a5161b78c536457dc6b54a8edef0d55fc4c4"} Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.951431 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29ffc6bbc85ed3fa68b0983329d5a5161b78c536457dc6b54a8edef0d55fc4c4" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.951474 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zkjwj" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.953192 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jjw96" event={"ID":"db49787b-26a2-4daf-a2fc-040890054555","Type":"ContainerDied","Data":"302cf3bfba759080ead5c5932999ee24b54469d7074964d3130bbe560505bbba"} Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.953219 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302cf3bfba759080ead5c5932999ee24b54469d7074964d3130bbe560505bbba" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.953289 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jjw96" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.957094 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.957130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5c5d-account-create-update-r2hfq" event={"ID":"9254e1be-19ef-4af5-b8f8-ecea111851d9","Type":"ContainerDied","Data":"d57ac7fc0a0c90fb25bfcd8f6d5cdbd7fca6633de71cd0e32b725b3449570898"} Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.957203 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57ac7fc0a0c90fb25bfcd8f6d5cdbd7fca6633de71cd0e32b725b3449570898" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.958859 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.958864 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e30c-account-create-update-shf9k" event={"ID":"7173d1b4-2e90-4382-865d-4736322f38fa","Type":"ContainerDied","Data":"54a52e213d558a1863d828e2c486e3a78fbcec0fbd70104e09c228c5e50ad83f"} Nov 26 13:52:41 crc kubenswrapper[5200]: I1126 13:52:41.959463 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a52e213d558a1863d828e2c486e3a78fbcec0fbd70104e09c228c5e50ad83f" Nov 26 13:52:42 crc kubenswrapper[5200]: E1126 13:52:42.848422 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1699959 actualBytes=10240 Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.015032 5200 generic.go:358] "Generic (PLEG): container finished" podID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerID="ba732d56e77d5e6b2577e3ec7571200684e3ec61d202b05566fabb682a9c0938" exitCode=0 Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.015112 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerDied","Data":"ba732d56e77d5e6b2577e3ec7571200684e3ec61d202b05566fabb682a9c0938"} Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.170485 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.237201 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c42lw"] Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238547 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="proxy-httpd" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238573 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="proxy-httpd" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238591 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1efdd422-f937-4662-a6f4-3b0e6b0232dd" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238598 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efdd422-f937-4662-a6f4-3b0e6b0232dd" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238606 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9254e1be-19ef-4af5-b8f8-ecea111851d9" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238615 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9254e1be-19ef-4af5-b8f8-ecea111851d9" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238626 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7173d1b4-2e90-4382-865d-4736322f38fa" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238632 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7173d1b4-2e90-4382-865d-4736322f38fa" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238641 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238646 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238663 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db49787b-26a2-4daf-a2fc-040890054555" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238668 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="db49787b-26a2-4daf-a2fc-040890054555" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238694 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-notification-agent" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238702 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-notification-agent" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238717 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="sg-core" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238725 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="sg-core" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238742 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238747 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238762 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-central-agent" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238768 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-central-agent" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238950 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1efdd422-f937-4662-a6f4-3b0e6b0232dd" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238963 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7173d1b4-2e90-4382-865d-4736322f38fa" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238978 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9254e1be-19ef-4af5-b8f8-ecea111851d9" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238986 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" containerName="mariadb-account-create-update" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.238994 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="proxy-httpd" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.239005 5200 
memory_manager.go:356] "RemoveStaleState removing state" podUID="db49787b-26a2-4daf-a2fc-040890054555" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.239017 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="sg-core" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.239027 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-central-agent" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.239035 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" containerName="ceilometer-notification-agent" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.239042 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" containerName="mariadb-database-create" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.249503 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.253327 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-nova-dockercfg-cl5s2\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.253513 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-scripts\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.253633 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.263226 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c42lw"] Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296206 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-config-data\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296257 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-sg-core-conf-yaml\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296498 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-log-httpd\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296669 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-scripts\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296799 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twx9z\" (UniqueName: 
\"kubernetes.io/projected/05b1b251-bdcc-4116-8606-ce6d963ca918-kube-api-access-twx9z\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296929 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-run-httpd\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.296945 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-combined-ca-bundle\") pod \"05b1b251-bdcc-4116-8606-ce6d963ca918\" (UID: \"05b1b251-bdcc-4116-8606-ce6d963ca918\") " Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.298795 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.299093 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.304041 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b1b251-bdcc-4116-8606-ce6d963ca918-kube-api-access-twx9z" (OuterVolumeSpecName: "kube-api-access-twx9z") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "kube-api-access-twx9z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.308200 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-scripts" (OuterVolumeSpecName: "scripts") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.328335 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401036 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-scripts\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401117 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-config-data\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401155 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fs9\" (UniqueName: \"kubernetes.io/projected/34c6bcc3-5315-43f6-9765-408aaa9e882f-kube-api-access-65fs9\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401182 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401345 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401360 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401371 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05b1b251-bdcc-4116-8606-ce6d963ca918-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.401385 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.402309 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twx9z\" (UniqueName: \"kubernetes.io/projected/05b1b251-bdcc-4116-8606-ce6d963ca918-kube-api-access-twx9z\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.417311 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.437954 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-config-data" (OuterVolumeSpecName: "config-data") pod "05b1b251-bdcc-4116-8606-ce6d963ca918" (UID: "05b1b251-bdcc-4116-8606-ce6d963ca918"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.504787 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-scripts\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.504858 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-config-data\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.504888 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65fs9\" (UniqueName: \"kubernetes.io/projected/34c6bcc3-5315-43f6-9765-408aaa9e882f-kube-api-access-65fs9\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.504926 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.505070 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.505084 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b1b251-bdcc-4116-8606-ce6d963ca918-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.509261 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.509492 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-scripts\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.509786 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-config-data\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.526388 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65fs9\" (UniqueName: \"kubernetes.io/projected/34c6bcc3-5315-43f6-9765-408aaa9e882f-kube-api-access-65fs9\") pod \"nova-cell0-conductor-db-sync-c42lw\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:47 crc kubenswrapper[5200]: I1126 13:52:47.581671 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.074893 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05b1b251-bdcc-4116-8606-ce6d963ca918","Type":"ContainerDied","Data":"7d2fc39dca2de13397a14ca09726d07f9982ad96fa78e3d09162ddf73c7c8dd1"} Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.076633 5200 scope.go:117] "RemoveContainer" containerID="aee7031d190bd100dde3fc076d32450de4117619575c422244a934bfe2ab3ae5" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.075177 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.108098 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c42lw"] Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.111417 5200 scope.go:117] "RemoveContainer" containerID="7e02c095fa5f94d68944d6e6025e4176a5f901bb556cf67a3c9b8af2661931ad" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.122241 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:48 crc kubenswrapper[5200]: W1126 13:52:48.137426 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c6bcc3_5315_43f6_9765_408aaa9e882f.slice/crio-6556083f17bfb863453fbbb9f07be8c22e019f799e755c49e44dde85f1579d68 WatchSource:0}: Error finding container 6556083f17bfb863453fbbb9f07be8c22e019f799e755c49e44dde85f1579d68: Status 404 returned error can't find the container with id 6556083f17bfb863453fbbb9f07be8c22e019f799e755c49e44dde85f1579d68 Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.137961 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.163428 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.191233 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.191454 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.194318 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.194402 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.199724 5200 scope.go:117] "RemoveContainer" containerID="14472cb1362fed56614abd2f3cfbc49c72f1d300d9597558e3d6810ed9a5890b" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.234962 5200 scope.go:117] "RemoveContainer" containerID="ba732d56e77d5e6b2577e3ec7571200684e3ec61d202b05566fabb682a9c0938" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.334335 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-scripts\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.334456 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-log-httpd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.334748 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-config-data\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.334832 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.335043 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.335408 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtkd\" (UniqueName: \"kubernetes.io/projected/43f0d583-75fa-4937-90ed-a4a7e564c081-kube-api-access-vrtkd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.335549 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-run-httpd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438036 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-log-httpd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438109 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-config-data\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438127 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438191 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438254 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtkd\" (UniqueName: \"kubernetes.io/projected/43f0d583-75fa-4937-90ed-a4a7e564c081-kube-api-access-vrtkd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438282 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-run-httpd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438316 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-scripts\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438804 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-log-httpd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.438876 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-run-httpd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.446311 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-config-data\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.448037 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-scripts\") 
pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.449650 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.455576 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtkd\" (UniqueName: \"kubernetes.io/projected/43f0d583-75fa-4937-90ed-a4a7e564c081-kube-api-access-vrtkd\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.462150 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " pod="openstack/ceilometer-0" Nov 26 13:52:48 crc kubenswrapper[5200]: I1126 13:52:48.515813 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:52:49 crc kubenswrapper[5200]: I1126 13:52:49.009467 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b1b251-bdcc-4116-8606-ce6d963ca918" path="/var/lib/kubelet/pods/05b1b251-bdcc-4116-8606-ce6d963ca918/volumes" Nov 26 13:52:49 crc kubenswrapper[5200]: W1126 13:52:49.052649 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f0d583_75fa_4937_90ed_a4a7e564c081.slice/crio-076f8accd7c27d6460e80f1a86aaf901223b7fcfd4049d5773aa23ac95b8fd2a WatchSource:0}: Error finding container 076f8accd7c27d6460e80f1a86aaf901223b7fcfd4049d5773aa23ac95b8fd2a: Status 404 returned error can't find the container with id 076f8accd7c27d6460e80f1a86aaf901223b7fcfd4049d5773aa23ac95b8fd2a Nov 26 13:52:49 crc kubenswrapper[5200]: I1126 13:52:49.063894 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:52:49 crc kubenswrapper[5200]: I1126 13:52:49.090186 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c42lw" event={"ID":"34c6bcc3-5315-43f6-9765-408aaa9e882f","Type":"ContainerStarted","Data":"6556083f17bfb863453fbbb9f07be8c22e019f799e755c49e44dde85f1579d68"} Nov 26 13:52:49 crc kubenswrapper[5200]: I1126 13:52:49.094340 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerStarted","Data":"076f8accd7c27d6460e80f1a86aaf901223b7fcfd4049d5773aa23ac95b8fd2a"} Nov 26 13:52:50 crc kubenswrapper[5200]: I1126 13:52:50.137190 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerStarted","Data":"0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465"} Nov 26 13:52:51 crc kubenswrapper[5200]: I1126 13:52:51.150237 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerStarted","Data":"c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519"} Nov 26 13:52:56 crc kubenswrapper[5200]: I1126 
13:52:56.213067 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerStarted","Data":"8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2"} Nov 26 13:52:56 crc kubenswrapper[5200]: I1126 13:52:56.218776 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c42lw" event={"ID":"34c6bcc3-5315-43f6-9765-408aaa9e882f","Type":"ContainerStarted","Data":"b9bea9b3d60d49f0f23ce4d1f8f0a5cdbb34657ba354b353d520cedd7760db3f"} Nov 26 13:52:58 crc kubenswrapper[5200]: I1126 13:52:58.249359 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerStarted","Data":"5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292"} Nov 26 13:52:58 crc kubenswrapper[5200]: I1126 13:52:58.250811 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:52:58 crc kubenswrapper[5200]: I1126 13:52:58.294829 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-c42lw" podStartSLOduration=4.031075493 podStartE2EDuration="11.294805257s" podCreationTimestamp="2025-11-26 13:52:47 +0000 UTC" firstStartedPulling="2025-11-26 13:52:48.142061412 +0000 UTC m=+1336.821263240" lastFinishedPulling="2025-11-26 13:52:55.405791186 +0000 UTC m=+1344.084993004" observedRunningTime="2025-11-26 13:52:56.242953334 +0000 UTC m=+1344.922155152" watchObservedRunningTime="2025-11-26 13:52:58.294805257 +0000 UTC m=+1346.974007085" Nov 26 13:52:58 crc kubenswrapper[5200]: I1126 13:52:58.297410 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8818828459999999 podStartE2EDuration="10.297402509s" podCreationTimestamp="2025-11-26 13:52:48 +0000 UTC" firstStartedPulling="2025-11-26 13:52:49.055134777 +0000 UTC m=+1337.734336605" lastFinishedPulling="2025-11-26 13:52:57.47065442 +0000 UTC m=+1346.149856268" observedRunningTime="2025-11-26 13:52:58.286912558 +0000 UTC m=+1346.966114406" watchObservedRunningTime="2025-11-26 13:52:58.297402509 +0000 UTC m=+1346.976604337" Nov 26 13:53:07 crc kubenswrapper[5200]: I1126 13:53:07.390283 5200 generic.go:358] "Generic (PLEG): container finished" podID="34c6bcc3-5315-43f6-9765-408aaa9e882f" containerID="b9bea9b3d60d49f0f23ce4d1f8f0a5cdbb34657ba354b353d520cedd7760db3f" exitCode=0 Nov 26 13:53:07 crc kubenswrapper[5200]: I1126 13:53:07.390403 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c42lw" event={"ID":"34c6bcc3-5315-43f6-9765-408aaa9e882f","Type":"ContainerDied","Data":"b9bea9b3d60d49f0f23ce4d1f8f0a5cdbb34657ba354b353d520cedd7760db3f"} Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.753564 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.795723 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-scripts\") pod \"34c6bcc3-5315-43f6-9765-408aaa9e882f\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.795909 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-config-data\") pod \"34c6bcc3-5315-43f6-9765-408aaa9e882f\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.795966 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65fs9\" (UniqueName: \"kubernetes.io/projected/34c6bcc3-5315-43f6-9765-408aaa9e882f-kube-api-access-65fs9\") pod \"34c6bcc3-5315-43f6-9765-408aaa9e882f\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.796023 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-combined-ca-bundle\") pod \"34c6bcc3-5315-43f6-9765-408aaa9e882f\" (UID: \"34c6bcc3-5315-43f6-9765-408aaa9e882f\") " Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.807902 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-scripts" (OuterVolumeSpecName: "scripts") pod "34c6bcc3-5315-43f6-9765-408aaa9e882f" (UID: "34c6bcc3-5315-43f6-9765-408aaa9e882f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.808637 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c6bcc3-5315-43f6-9765-408aaa9e882f-kube-api-access-65fs9" (OuterVolumeSpecName: "kube-api-access-65fs9") pod "34c6bcc3-5315-43f6-9765-408aaa9e882f" (UID: "34c6bcc3-5315-43f6-9765-408aaa9e882f"). InnerVolumeSpecName "kube-api-access-65fs9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.827360 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34c6bcc3-5315-43f6-9765-408aaa9e882f" (UID: "34c6bcc3-5315-43f6-9765-408aaa9e882f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.843601 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-config-data" (OuterVolumeSpecName: "config-data") pod "34c6bcc3-5315-43f6-9765-408aaa9e882f" (UID: "34c6bcc3-5315-43f6-9765-408aaa9e882f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.899576 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.899615 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.899625 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65fs9\" (UniqueName: \"kubernetes.io/projected/34c6bcc3-5315-43f6-9765-408aaa9e882f-kube-api-access-65fs9\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:08 crc kubenswrapper[5200]: I1126 13:53:08.899634 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c6bcc3-5315-43f6-9765-408aaa9e882f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.435804 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c42lw" event={"ID":"34c6bcc3-5315-43f6-9765-408aaa9e882f","Type":"ContainerDied","Data":"6556083f17bfb863453fbbb9f07be8c22e019f799e755c49e44dde85f1579d68"} Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.435858 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6556083f17bfb863453fbbb9f07be8c22e019f799e755c49e44dde85f1579d68" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.435944 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c42lw" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.559192 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.561049 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34c6bcc3-5315-43f6-9765-408aaa9e882f" containerName="nova-cell0-conductor-db-sync" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.561075 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c6bcc3-5315-43f6-9765-408aaa9e882f" containerName="nova-cell0-conductor-db-sync" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.561317 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="34c6bcc3-5315-43f6-9765-408aaa9e882f" containerName="nova-cell0-conductor-db-sync" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.597155 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.597357 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.601408 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.601658 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-nova-dockercfg-cl5s2\"" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.621795 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.621898 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.621984 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9qn\" (UniqueName: \"kubernetes.io/projected/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-kube-api-access-4z9qn\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.723554 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.723647 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9qn\" (UniqueName: \"kubernetes.io/projected/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-kube-api-access-4z9qn\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.723756 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.728714 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.733242 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc 
kubenswrapper[5200]: I1126 13:53:09.746512 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9qn\" (UniqueName: \"kubernetes.io/projected/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-kube-api-access-4z9qn\") pod \"nova-cell0-conductor-0\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:09 crc kubenswrapper[5200]: I1126 13:53:09.929851 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:10 crc kubenswrapper[5200]: I1126 13:53:10.404756 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 13:53:10 crc kubenswrapper[5200]: W1126 13:53:10.412608 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8edf3f05_24ba_41fa_82bd_3a4e57a1cee6.slice/crio-83dc709ebbea8111c9b9ced4f84a4d02d0ffc9256e56133973446b3f6feb457e WatchSource:0}: Error finding container 83dc709ebbea8111c9b9ced4f84a4d02d0ffc9256e56133973446b3f6feb457e: Status 404 returned error can't find the container with id 83dc709ebbea8111c9b9ced4f84a4d02d0ffc9256e56133973446b3f6feb457e Nov 26 13:53:10 crc kubenswrapper[5200]: I1126 13:53:10.449408 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6","Type":"ContainerStarted","Data":"83dc709ebbea8111c9b9ced4f84a4d02d0ffc9256e56133973446b3f6feb457e"} Nov 26 13:53:11 crc kubenswrapper[5200]: I1126 13:53:11.465322 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6","Type":"ContainerStarted","Data":"30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851"} Nov 26 13:53:11 crc kubenswrapper[5200]: I1126 13:53:11.508445 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.508415568 podStartE2EDuration="2.508415568s" podCreationTimestamp="2025-11-26 13:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:11.496280329 +0000 UTC m=+1360.175482147" watchObservedRunningTime="2025-11-26 13:53:11.508415568 +0000 UTC m=+1360.187617376" Nov 26 13:53:12 crc kubenswrapper[5200]: I1126 13:53:12.478053 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:18 crc kubenswrapper[5200]: I1126 13:53:18.704929 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.176443 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pq5lt"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.195670 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.200322 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-manage-scripts\"" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.216244 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-manage-config-data\"" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.225437 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pq5lt"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.274201 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-config-data\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.274270 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhf5\" (UniqueName: \"kubernetes.io/projected/3f93568e-5ff5-4a13-af0f-67dddc7615c1-kube-api-access-qbhf5\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.274392 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-scripts\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.274471 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.376259 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-config-data\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.376343 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhf5\" (UniqueName: \"kubernetes.io/projected/3f93568e-5ff5-4a13-af0f-67dddc7615c1-kube-api-access-qbhf5\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.376408 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-scripts\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.376465 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.395707 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-scripts\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.396106 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-config-data\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.396402 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.412595 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhf5\" (UniqueName: \"kubernetes.io/projected/3f93568e-5ff5-4a13-af0f-67dddc7615c1-kube-api-access-qbhf5\") pod \"nova-cell0-cell-mapping-pq5lt\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.489442 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.504051 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.513084 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.536343 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.547002 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.550627 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.560498 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.577744 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.582084 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7198255-2c57-440a-aafe-ac9fceb5889a-logs\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.582180 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-config-data\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.582216 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.582248 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2hg\" (UniqueName: \"kubernetes.io/projected/d7198255-2c57-440a-aafe-ac9fceb5889a-kube-api-access-pc2hg\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.673767 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749161 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmnbz\" (UniqueName: \"kubernetes.io/projected/da7ec657-e265-435a-a731-754cd19f9072-kube-api-access-rmnbz\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749349 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7198255-2c57-440a-aafe-ac9fceb5889a-logs\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749475 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749601 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-config-data\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749667 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749742 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-config-data\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.749789 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2hg\" (UniqueName: \"kubernetes.io/projected/d7198255-2c57-440a-aafe-ac9fceb5889a-kube-api-access-pc2hg\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.750848 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7198255-2c57-440a-aafe-ac9fceb5889a-logs\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.763883 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-config-data\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.774693 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.784570 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.833201 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2hg\" (UniqueName: \"kubernetes.io/projected/d7198255-2c57-440a-aafe-ac9fceb5889a-kube-api-access-pc2hg\") pod \"nova-api-0\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.834307 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.851694 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-config-data\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.851796 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmnbz\" (UniqueName: \"kubernetes.io/projected/da7ec657-e265-435a-a731-754cd19f9072-kube-api-access-rmnbz\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.851871 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.857898 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.861163 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.866627 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-config-data\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.877283 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.890381 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.896338 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmnbz\" (UniqueName: \"kubernetes.io/projected/da7ec657-e265-435a-a731-754cd19f9072-kube-api-access-rmnbz\") pod \"nova-scheduler-0\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.950869 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.955928 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.956011 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0329d704-9dd2-403c-87b1-73794a949de0-logs\") pod 
\"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.956030 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w29n\" (UniqueName: \"kubernetes.io/projected/0329d704-9dd2-403c-87b1-73794a949de0-kube-api-access-2w29n\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.956050 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-config-data\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.976924 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.987583 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-novncproxy-config-data\"" Nov 26 13:53:19 crc kubenswrapper[5200]: I1126 13:53:19.994095 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.019873 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59f7db47d5-mbc9q"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.030670 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.032078 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.060224 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.060327 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0329d704-9dd2-403c-87b1-73794a949de0-logs\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.060355 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w29n\" (UniqueName: \"kubernetes.io/projected/0329d704-9dd2-403c-87b1-73794a949de0-kube-api-access-2w29n\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.060376 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-config-data\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.064229 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0329d704-9dd2-403c-87b1-73794a949de0-logs\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.070144 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59f7db47d5-mbc9q"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.071698 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.074543 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-config-data\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.104471 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w29n\" (UniqueName: \"kubernetes.io/projected/0329d704-9dd2-403c-87b1-73794a949de0-kube-api-access-2w29n\") pod \"nova-metadata-0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.163493 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ldc\" (UniqueName: \"kubernetes.io/projected/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-kube-api-access-f8ldc\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 
13:53:20.163553 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.163734 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-sb\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.163752 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n6jj\" (UniqueName: \"kubernetes.io/projected/21e48d05-59f9-42a8-bc88-60ade1961efb-kube-api-access-2n6jj\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.163792 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.163821 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-svc\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.163854 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-config\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.165160 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-nb\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.165221 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-swift-storage-0\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.232427 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.267980 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-sb\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268173 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n6jj\" (UniqueName: \"kubernetes.io/projected/21e48d05-59f9-42a8-bc88-60ade1961efb-kube-api-access-2n6jj\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268297 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268419 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-svc\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268521 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-config\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268626 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-nb\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268730 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-swift-storage-0\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268830 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ldc\" (UniqueName: \"kubernetes.io/projected/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-kube-api-access-f8ldc\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.268914 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc 
kubenswrapper[5200]: I1126 13:53:20.269958 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-config\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.270527 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-svc\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.271227 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-swift-storage-0\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.271965 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-nb\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.272533 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-sb\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.274215 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.278266 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.292710 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ldc\" (UniqueName: \"kubernetes.io/projected/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-kube-api-access-f8ldc\") pod \"dnsmasq-dns-59f7db47d5-mbc9q\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.297114 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n6jj\" (UniqueName: \"kubernetes.io/projected/21e48d05-59f9-42a8-bc88-60ade1961efb-kube-api-access-2n6jj\") pod \"nova-cell1-novncproxy-0\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.303554 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.377799 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.520282 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pq5lt"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.621862 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7bdlq"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.647739 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.650772 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-scripts\"" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.651253 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.657060 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7bdlq"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.737792 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:20 crc kubenswrapper[5200]: W1126 13:53:20.746703 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7ec657_e265_435a_a731_754cd19f9072.slice/crio-06dd8730020af08e0c2a1952c274a87f13c21b8223c8d066451001ebdd12eba8 WatchSource:0}: Error finding container 06dd8730020af08e0c2a1952c274a87f13c21b8223c8d066451001ebdd12eba8: Status 404 returned error can't find the container with id 06dd8730020af08e0c2a1952c274a87f13c21b8223c8d066451001ebdd12eba8 Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.747496 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.784433 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-config-data\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.784609 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-scripts\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.784642 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxn4g\" (UniqueName: \"kubernetes.io/projected/721e7087-5753-4216-9598-3497ce8f92a9-kube-api-access-dxn4g\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.784732 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.801037 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7198255-2c57-440a-aafe-ac9fceb5889a","Type":"ContainerStarted","Data":"4f9388ade7708d7af423a2f70499c0cd4a4dbc01600851ef89a775ae7b9c521f"} Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.802523 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da7ec657-e265-435a-a731-754cd19f9072","Type":"ContainerStarted","Data":"06dd8730020af08e0c2a1952c274a87f13c21b8223c8d066451001ebdd12eba8"} Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.805514 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pq5lt" event={"ID":"3f93568e-5ff5-4a13-af0f-67dddc7615c1","Type":"ContainerStarted","Data":"dfb6a769c45a35f54dca88543e258006611a74a0939830b3108dec9b41f8351b"} Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.880580 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.886967 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-scripts\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.888106 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxn4g\" (UniqueName: \"kubernetes.io/projected/721e7087-5753-4216-9598-3497ce8f92a9-kube-api-access-dxn4g\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.888184 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.888257 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-config-data\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.910050 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.911303 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxn4g\" (UniqueName: 
\"kubernetes.io/projected/721e7087-5753-4216-9598-3497ce8f92a9-kube-api-access-dxn4g\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.914060 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-scripts\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.914643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-config-data\") pod \"nova-cell1-conductor-db-sync-7bdlq\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:20 crc kubenswrapper[5200]: W1126 13:53:20.948445 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e48d05_59f9_42a8_bc88_60ade1961efb.slice/crio-e0aafcfd182b7b078de36ab8f11f86a3b865e31c577a176e9f478f4f8c6ea01a WatchSource:0}: Error finding container e0aafcfd182b7b078de36ab8f11f86a3b865e31c577a176e9f478f4f8c6ea01a: Status 404 returned error can't find the container with id e0aafcfd182b7b078de36ab8f11f86a3b865e31c577a176e9f478f4f8c6ea01a Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.949679 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:20 crc kubenswrapper[5200]: I1126 13:53:20.972802 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.056027 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59f7db47d5-mbc9q"] Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.550342 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7bdlq"] Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.826696 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0329d704-9dd2-403c-87b1-73794a949de0","Type":"ContainerStarted","Data":"f6f468d5698438d3332199dcd5667b5c9e60ea0bb19bc5ebb9df2b1f71a95f7e"} Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.833964 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21e48d05-59f9-42a8-bc88-60ade1961efb","Type":"ContainerStarted","Data":"e0aafcfd182b7b078de36ab8f11f86a3b865e31c577a176e9f478f4f8c6ea01a"} Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.837381 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pq5lt" event={"ID":"3f93568e-5ff5-4a13-af0f-67dddc7615c1","Type":"ContainerStarted","Data":"1afaf085c089cd36284f090e60b9e0ede6edf0fcbb9d8f0be71bde2698693e43"} Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.846739 5200 generic.go:358] "Generic (PLEG): container finished" podID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerID="7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b" exitCode=0 Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.846865 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" event={"ID":"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6","Type":"ContainerDied","Data":"7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b"} Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.846891 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" event={"ID":"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6","Type":"ContainerStarted","Data":"cec617543077dcadca6beb0fc285732fa9f9b5ce5ffd1192757be46d5af7bbc4"} Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.858174 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" event={"ID":"721e7087-5753-4216-9598-3497ce8f92a9","Type":"ContainerStarted","Data":"956a998449ecc48a38f87187a59a5006a55471ce6a6e8f275446aefd62347bb0"} Nov 26 13:53:21 crc kubenswrapper[5200]: I1126 13:53:21.866263 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pq5lt" podStartSLOduration=2.866237322 podStartE2EDuration="2.866237322s" podCreationTimestamp="2025-11-26 13:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:21.854676228 +0000 UTC m=+1370.533878046" watchObservedRunningTime="2025-11-26 13:53:21.866237322 +0000 UTC m=+1370.545439140" Nov 26 13:53:22 crc kubenswrapper[5200]: I1126 13:53:22.878793 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" event={"ID":"721e7087-5753-4216-9598-3497ce8f92a9","Type":"ContainerStarted","Data":"68b0094d24ce9ed64996fa8a71a98c96dd392c95b50b7a7799481cf8fdd85c51"} Nov 26 13:53:22 crc kubenswrapper[5200]: I1126 13:53:22.897919 5200 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" podStartSLOduration=2.897897786 podStartE2EDuration="2.897897786s" podCreationTimestamp="2025-11-26 13:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:22.895914166 +0000 UTC m=+1371.575115994" watchObservedRunningTime="2025-11-26 13:53:22.897897786 +0000 UTC m=+1371.577099604" Nov 26 13:53:23 crc kubenswrapper[5200]: I1126 13:53:23.898532 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" event={"ID":"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6","Type":"ContainerStarted","Data":"5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451"} Nov 26 13:53:23 crc kubenswrapper[5200]: I1126 13:53:23.899078 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:23 crc kubenswrapper[5200]: I1126 13:53:23.937129 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" podStartSLOduration=4.937105282 podStartE2EDuration="4.937105282s" podCreationTimestamp="2025-11-26 13:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:23.932171556 +0000 UTC m=+1372.611373374" watchObservedRunningTime="2025-11-26 13:53:23.937105282 +0000 UTC m=+1372.616307100" Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.017655 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.041844 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.931965 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7198255-2c57-440a-aafe-ac9fceb5889a","Type":"ContainerStarted","Data":"419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b"} Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.935529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7198255-2c57-440a-aafe-ac9fceb5889a","Type":"ContainerStarted","Data":"14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399"} Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.940443 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0329d704-9dd2-403c-87b1-73794a949de0","Type":"ContainerStarted","Data":"c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d"} Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.940597 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0329d704-9dd2-403c-87b1-73794a949de0","Type":"ContainerStarted","Data":"e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9"} Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.940836 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-log" containerID="cri-o://e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9" gracePeriod=30 Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.941227 5200 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-metadata" containerID="cri-o://c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d" gracePeriod=30 Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.951380 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.437258848 podStartE2EDuration="5.951360222s" podCreationTimestamp="2025-11-26 13:53:19 +0000 UTC" firstStartedPulling="2025-11-26 13:53:20.750992321 +0000 UTC m=+1369.430194149" lastFinishedPulling="2025-11-26 13:53:24.265093705 +0000 UTC m=+1372.944295523" observedRunningTime="2025-11-26 13:53:24.950998353 +0000 UTC m=+1373.630200171" watchObservedRunningTime="2025-11-26 13:53:24.951360222 +0000 UTC m=+1373.630562040" Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.955544 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da7ec657-e265-435a-a731-754cd19f9072","Type":"ContainerStarted","Data":"56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6"} Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.971419 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="21e48d05-59f9-42a8-bc88-60ade1961efb" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f" gracePeriod=30 Nov 26 13:53:24 crc kubenswrapper[5200]: I1126 13:53:24.988773 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.617638413 podStartE2EDuration="5.988758695s" podCreationTimestamp="2025-11-26 13:53:19 +0000 UTC" firstStartedPulling="2025-11-26 13:53:20.879448953 +0000 UTC m=+1369.558650771" lastFinishedPulling="2025-11-26 13:53:24.250569235 +0000 UTC m=+1372.929771053" observedRunningTime="2025-11-26 13:53:24.974227785 +0000 UTC m=+1373.653429593" watchObservedRunningTime="2025-11-26 13:53:24.988758695 +0000 UTC m=+1373.667960513" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.019499 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.519747579 podStartE2EDuration="6.019467267s" podCreationTimestamp="2025-11-26 13:53:19 +0000 UTC" firstStartedPulling="2025-11-26 13:53:20.752222792 +0000 UTC m=+1369.431424610" lastFinishedPulling="2025-11-26 13:53:24.25194247 +0000 UTC m=+1372.931144298" observedRunningTime="2025-11-26 13:53:25.008753794 +0000 UTC m=+1373.687955612" watchObservedRunningTime="2025-11-26 13:53:25.019467267 +0000 UTC m=+1373.698669085" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.021664 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21e48d05-59f9-42a8-bc88-60ade1961efb","Type":"ContainerStarted","Data":"d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f"} Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.031568 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.037096 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.737048673 podStartE2EDuration="6.037072205s" podCreationTimestamp="2025-11-26 13:53:19 +0000 UTC" firstStartedPulling="2025-11-26 
13:53:20.951821186 +0000 UTC m=+1369.631023004" lastFinishedPulling="2025-11-26 13:53:24.251844718 +0000 UTC m=+1372.931046536" observedRunningTime="2025-11-26 13:53:25.033047733 +0000 UTC m=+1373.712249561" watchObservedRunningTime="2025-11-26 13:53:25.037072205 +0000 UTC m=+1373.716274013" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.233023 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.233094 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.306855 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.989715 5200 generic.go:358] "Generic (PLEG): container finished" podID="0329d704-9dd2-403c-87b1-73794a949de0" containerID="e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9" exitCode=143 Nov 26 13:53:25 crc kubenswrapper[5200]: I1126 13:53:25.989902 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0329d704-9dd2-403c-87b1-73794a949de0","Type":"ContainerDied","Data":"e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9"} Nov 26 13:53:29 crc kubenswrapper[5200]: I1126 13:53:29.284202 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 13:53:29 crc kubenswrapper[5200]: I1126 13:53:29.835579 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 13:53:29 crc kubenswrapper[5200]: I1126 13:53:29.836080 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 13:53:29 crc kubenswrapper[5200]: I1126 13:53:29.973898 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.032176 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.066503 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54db795db7-2zrdk"] Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.067019 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" containerName="dnsmasq-dns" containerID="cri-o://9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5" gracePeriod=10 Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.092903 5200 generic.go:358] "Generic (PLEG): container finished" podID="3f93568e-5ff5-4a13-af0f-67dddc7615c1" containerID="1afaf085c089cd36284f090e60b9e0ede6edf0fcbb9d8f0be71bde2698693e43" exitCode=0 Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.093260 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pq5lt" event={"ID":"3f93568e-5ff5-4a13-af0f-67dddc7615c1","Type":"ContainerDied","Data":"1afaf085c089cd36284f090e60b9e0ede6edf0fcbb9d8f0be71bde2698693e43"} Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.093438 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.636732 
5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.755521 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-swift-storage-0\") pod \"13b63abc-53cc-433b-8635-2cc87fed8052\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.755794 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-nb\") pod \"13b63abc-53cc-433b-8635-2cc87fed8052\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.755853 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-svc\") pod \"13b63abc-53cc-433b-8635-2cc87fed8052\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.755902 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j6qf\" (UniqueName: \"kubernetes.io/projected/13b63abc-53cc-433b-8635-2cc87fed8052-kube-api-access-6j6qf\") pod \"13b63abc-53cc-433b-8635-2cc87fed8052\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.755953 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-sb\") pod \"13b63abc-53cc-433b-8635-2cc87fed8052\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.756095 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-config\") pod \"13b63abc-53cc-433b-8635-2cc87fed8052\" (UID: \"13b63abc-53cc-433b-8635-2cc87fed8052\") " Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.764609 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b63abc-53cc-433b-8635-2cc87fed8052-kube-api-access-6j6qf" (OuterVolumeSpecName: "kube-api-access-6j6qf") pod "13b63abc-53cc-433b-8635-2cc87fed8052" (UID: "13b63abc-53cc-433b-8635-2cc87fed8052"). InnerVolumeSpecName "kube-api-access-6j6qf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.834654 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13b63abc-53cc-433b-8635-2cc87fed8052" (UID: "13b63abc-53cc-433b-8635-2cc87fed8052"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.835568 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-config" (OuterVolumeSpecName: "config") pod "13b63abc-53cc-433b-8635-2cc87fed8052" (UID: "13b63abc-53cc-433b-8635-2cc87fed8052"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.836055 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.845393 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13b63abc-53cc-433b-8635-2cc87fed8052" (UID: "13b63abc-53cc-433b-8635-2cc87fed8052"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.850299 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13b63abc-53cc-433b-8635-2cc87fed8052" (UID: "13b63abc-53cc-433b-8635-2cc87fed8052"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.858310 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.858349 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.858365 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.858377 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6j6qf\" (UniqueName: \"kubernetes.io/projected/13b63abc-53cc-433b-8635-2cc87fed8052-kube-api-access-6j6qf\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.858392 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.876959 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.891155 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13b63abc-53cc-433b-8635-2cc87fed8052" (UID: "13b63abc-53cc-433b-8635-2cc87fed8052"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:53:30 crc kubenswrapper[5200]: I1126 13:53:30.961564 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13b63abc-53cc-433b-8635-2cc87fed8052-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.122299 5200 generic.go:358] "Generic (PLEG): container finished" podID="13b63abc-53cc-433b-8635-2cc87fed8052" containerID="9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5" exitCode=0 Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.123645 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.123788 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" event={"ID":"13b63abc-53cc-433b-8635-2cc87fed8052","Type":"ContainerDied","Data":"9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5"} Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.123874 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54db795db7-2zrdk" event={"ID":"13b63abc-53cc-433b-8635-2cc87fed8052","Type":"ContainerDied","Data":"43051443c94310af51f1987fda355e6d1a47e1f7d30e36e600d1c14cfb629cc2"} Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.123904 5200 scope.go:117] "RemoveContainer" containerID="9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.202781 5200 scope.go:117] "RemoveContainer" containerID="337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.205747 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54db795db7-2zrdk"] Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.206963 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.234797 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54db795db7-2zrdk"] Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.308582 5200 scope.go:117] "RemoveContainer" containerID="9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5" Nov 26 13:53:31 crc kubenswrapper[5200]: E1126 13:53:31.312830 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5\": container with ID starting with 9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5 not found: ID does not exist" containerID="9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.312884 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5"} err="failed to get container status \"9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5\": rpc error: code = NotFound desc = could not find container \"9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5\": container with ID starting with 9f4d4b33ea64a6d701e3ce4b87b87a76ff98ad83d8d8840f529179fe310a8cf5 not found: ID does not exist" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.312915 5200 
scope.go:117] "RemoveContainer" containerID="337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650" Nov 26 13:53:31 crc kubenswrapper[5200]: E1126 13:53:31.316081 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650\": container with ID starting with 337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650 not found: ID does not exist" containerID="337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.316148 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650"} err="failed to get container status \"337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650\": rpc error: code = NotFound desc = could not find container \"337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650\": container with ID starting with 337844c1f796947c18ed223ffc2f3e0c71406f77e01f2b19030df76068b37650 not found: ID does not exist" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.696308 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.788940 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-config-data\") pod \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.789167 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbhf5\" (UniqueName: \"kubernetes.io/projected/3f93568e-5ff5-4a13-af0f-67dddc7615c1-kube-api-access-qbhf5\") pod \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.789218 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-scripts\") pod \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.789372 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-combined-ca-bundle\") pod \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\" (UID: \"3f93568e-5ff5-4a13-af0f-67dddc7615c1\") " Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.796812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-scripts" (OuterVolumeSpecName: "scripts") pod "3f93568e-5ff5-4a13-af0f-67dddc7615c1" (UID: "3f93568e-5ff5-4a13-af0f-67dddc7615c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.821925 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f93568e-5ff5-4a13-af0f-67dddc7615c1-kube-api-access-qbhf5" (OuterVolumeSpecName: "kube-api-access-qbhf5") pod "3f93568e-5ff5-4a13-af0f-67dddc7615c1" (UID: "3f93568e-5ff5-4a13-af0f-67dddc7615c1"). 
InnerVolumeSpecName "kube-api-access-qbhf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.830194 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-config-data" (OuterVolumeSpecName: "config-data") pod "3f93568e-5ff5-4a13-af0f-67dddc7615c1" (UID: "3f93568e-5ff5-4a13-af0f-67dddc7615c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.848587 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f93568e-5ff5-4a13-af0f-67dddc7615c1" (UID: "3f93568e-5ff5-4a13-af0f-67dddc7615c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.892396 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.892440 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.892455 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbhf5\" (UniqueName: \"kubernetes.io/projected/3f93568e-5ff5-4a13-af0f-67dddc7615c1-kube-api-access-qbhf5\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:31 crc kubenswrapper[5200]: I1126 13:53:31.892469 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f93568e-5ff5-4a13-af0f-67dddc7615c1-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.135030 5200 generic.go:358] "Generic (PLEG): container finished" podID="721e7087-5753-4216-9598-3497ce8f92a9" containerID="68b0094d24ce9ed64996fa8a71a98c96dd392c95b50b7a7799481cf8fdd85c51" exitCode=0 Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.135079 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" event={"ID":"721e7087-5753-4216-9598-3497ce8f92a9","Type":"ContainerDied","Data":"68b0094d24ce9ed64996fa8a71a98c96dd392c95b50b7a7799481cf8fdd85c51"} Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.147914 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pq5lt" Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.148398 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pq5lt" event={"ID":"3f93568e-5ff5-4a13-af0f-67dddc7615c1","Type":"ContainerDied","Data":"dfb6a769c45a35f54dca88543e258006611a74a0939830b3108dec9b41f8351b"} Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.148440 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfb6a769c45a35f54dca88543e258006611a74a0939830b3108dec9b41f8351b" Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.295144 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.295496 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-api" containerID="cri-o://419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b" gracePeriod=30 Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.295448 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-log" containerID="cri-o://14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399" gracePeriod=30 Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.305394 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:32 crc kubenswrapper[5200]: I1126 13:53:32.992779 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" path="/var/lib/kubelet/pods/13b63abc-53cc-433b-8635-2cc87fed8052/volumes" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.155067 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.155161 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.159976 5200 generic.go:358] "Generic (PLEG): container finished" podID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerID="14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399" exitCode=143 Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.160064 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7198255-2c57-440a-aafe-ac9fceb5889a","Type":"ContainerDied","Data":"14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399"} Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.160659 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="da7ec657-e265-435a-a731-754cd19f9072" containerName="nova-scheduler-scheduler" containerID="cri-o://56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6" gracePeriod=30 Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 
13:53:33.553818 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.737634 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-combined-ca-bundle\") pod \"721e7087-5753-4216-9598-3497ce8f92a9\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.738109 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-config-data\") pod \"721e7087-5753-4216-9598-3497ce8f92a9\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.738156 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-scripts\") pod \"721e7087-5753-4216-9598-3497ce8f92a9\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.738331 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxn4g\" (UniqueName: \"kubernetes.io/projected/721e7087-5753-4216-9598-3497ce8f92a9-kube-api-access-dxn4g\") pod \"721e7087-5753-4216-9598-3497ce8f92a9\" (UID: \"721e7087-5753-4216-9598-3497ce8f92a9\") " Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.745407 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-scripts" (OuterVolumeSpecName: "scripts") pod "721e7087-5753-4216-9598-3497ce8f92a9" (UID: "721e7087-5753-4216-9598-3497ce8f92a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.745415 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721e7087-5753-4216-9598-3497ce8f92a9-kube-api-access-dxn4g" (OuterVolumeSpecName: "kube-api-access-dxn4g") pod "721e7087-5753-4216-9598-3497ce8f92a9" (UID: "721e7087-5753-4216-9598-3497ce8f92a9"). InnerVolumeSpecName "kube-api-access-dxn4g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.776961 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-config-data" (OuterVolumeSpecName: "config-data") pod "721e7087-5753-4216-9598-3497ce8f92a9" (UID: "721e7087-5753-4216-9598-3497ce8f92a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.792563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "721e7087-5753-4216-9598-3497ce8f92a9" (UID: "721e7087-5753-4216-9598-3497ce8f92a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.842399 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.842466 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.842478 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/721e7087-5753-4216-9598-3497ce8f92a9-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.842490 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxn4g\" (UniqueName: \"kubernetes.io/projected/721e7087-5753-4216-9598-3497ce8f92a9-kube-api-access-dxn4g\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.999023 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:53:33 crc kubenswrapper[5200]: I1126 13:53:33.999370 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="20a99fd8-1db7-48a0-ad37-5652b13c7e00" containerName="kube-state-metrics" containerID="cri-o://2ce94c31dcd5188fa720de14e32b86ec905a5c470d231749cdd9917175af70e9" gracePeriod=30 Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.182131 5200 generic.go:358] "Generic (PLEG): container finished" podID="20a99fd8-1db7-48a0-ad37-5652b13c7e00" containerID="2ce94c31dcd5188fa720de14e32b86ec905a5c470d231749cdd9917175af70e9" exitCode=2 Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.182856 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"20a99fd8-1db7-48a0-ad37-5652b13c7e00","Type":"ContainerDied","Data":"2ce94c31dcd5188fa720de14e32b86ec905a5c470d231749cdd9917175af70e9"} Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.186287 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" event={"ID":"721e7087-5753-4216-9598-3497ce8f92a9","Type":"ContainerDied","Data":"956a998449ecc48a38f87187a59a5006a55471ce6a6e8f275446aefd62347bb0"} Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.186420 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956a998449ecc48a38f87187a59a5006a55471ce6a6e8f275446aefd62347bb0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.186621 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7bdlq" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.253335 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254704 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" containerName="init" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254725 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" containerName="init" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254747 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f93568e-5ff5-4a13-af0f-67dddc7615c1" containerName="nova-manage" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254754 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f93568e-5ff5-4a13-af0f-67dddc7615c1" containerName="nova-manage" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254788 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="721e7087-5753-4216-9598-3497ce8f92a9" containerName="nova-cell1-conductor-db-sync" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254796 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="721e7087-5753-4216-9598-3497ce8f92a9" containerName="nova-cell1-conductor-db-sync" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254811 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" containerName="dnsmasq-dns" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.254817 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" containerName="dnsmasq-dns" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.255004 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="13b63abc-53cc-433b-8635-2cc87fed8052" containerName="dnsmasq-dns" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.255021 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="721e7087-5753-4216-9598-3497ce8f92a9" containerName="nova-cell1-conductor-db-sync" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.255036 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f93568e-5ff5-4a13-af0f-67dddc7615c1" containerName="nova-manage" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.270188 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.272352 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.273946 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.357922 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.357993 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnfv\" (UniqueName: \"kubernetes.io/projected/aae8bd0a-01d7-497d-b7fb-26c495823af7-kube-api-access-ztnfv\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.358053 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.459673 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.459834 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnfv\" (UniqueName: \"kubernetes.io/projected/aae8bd0a-01d7-497d-b7fb-26c495823af7-kube-api-access-ztnfv\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.459863 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.469805 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.472634 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.489555 
5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnfv\" (UniqueName: \"kubernetes.io/projected/aae8bd0a-01d7-497d-b7fb-26c495823af7-kube-api-access-ztnfv\") pod \"nova-cell1-conductor-0\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.552801 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.561290 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jh9f\" (UniqueName: \"kubernetes.io/projected/20a99fd8-1db7-48a0-ad37-5652b13c7e00-kube-api-access-2jh9f\") pod \"20a99fd8-1db7-48a0-ad37-5652b13c7e00\" (UID: \"20a99fd8-1db7-48a0-ad37-5652b13c7e00\") " Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.565187 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a99fd8-1db7-48a0-ad37-5652b13c7e00-kube-api-access-2jh9f" (OuterVolumeSpecName: "kube-api-access-2jh9f") pod "20a99fd8-1db7-48a0-ad37-5652b13c7e00" (UID: "20a99fd8-1db7-48a0-ad37-5652b13c7e00"). InnerVolumeSpecName "kube-api-access-2jh9f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.591859 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:34 crc kubenswrapper[5200]: I1126 13:53:34.670512 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jh9f\" (UniqueName: \"kubernetes.io/projected/20a99fd8-1db7-48a0-ad37-5652b13c7e00-kube-api-access-2jh9f\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.082227 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.218424 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"20a99fd8-1db7-48a0-ad37-5652b13c7e00","Type":"ContainerDied","Data":"f77ad4e3b078a74ae55351a8b70f4fa4f5e882e2ceff88b3b28eac508632a06e"} Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.218500 5200 scope.go:117] "RemoveContainer" containerID="2ce94c31dcd5188fa720de14e32b86ec905a5c470d231749cdd9917175af70e9" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.218703 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.232214 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aae8bd0a-01d7-497d-b7fb-26c495823af7","Type":"ContainerStarted","Data":"0a42a963ec31c8af8a7becd2d3868047ff5ae811d600960f93340e53053d2c31"} Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.284150 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.299000 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.331174 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.332413 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20a99fd8-1db7-48a0-ad37-5652b13c7e00" containerName="kube-state-metrics" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.332434 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a99fd8-1db7-48a0-ad37-5652b13c7e00" containerName="kube-state-metrics" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.332644 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="20a99fd8-1db7-48a0-ad37-5652b13c7e00" containerName="kube-state-metrics" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.348777 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.352561 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-kube-state-metrics-svc\"" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.352596 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"kube-state-metrics-tls-config\"" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.358757 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.394180 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.394246 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rxh4\" (UniqueName: \"kubernetes.io/projected/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-api-access-2rxh4\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.394272 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.394313 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.496512 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.496713 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.496752 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rxh4\" (UniqueName: \"kubernetes.io/projected/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-api-access-2rxh4\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.497772 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.502537 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.504400 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.505428 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.520431 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rxh4\" (UniqueName: \"kubernetes.io/projected/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-api-access-2rxh4\") pod \"kube-state-metrics-0\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " pod="openstack/kube-state-metrics-0" Nov 26 13:53:35 crc kubenswrapper[5200]: I1126 13:53:35.672047 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.043194 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.043994 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-central-agent" containerID="cri-o://0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465" gracePeriod=30 Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.044075 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="proxy-httpd" containerID="cri-o://5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292" gracePeriod=30 Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.044087 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="sg-core" containerID="cri-o://8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2" gracePeriod=30 Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.044145 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-notification-agent" containerID="cri-o://c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519" gracePeriod=30 Nov 26 13:53:36 crc kubenswrapper[5200]: E1126 13:53:36.134259 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:53:36 crc kubenswrapper[5200]: E1126 13:53:36.136309 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:53:36 crc kubenswrapper[5200]: E1126 13:53:36.137811 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:53:36 crc kubenswrapper[5200]: E1126 13:53:36.137861 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="da7ec657-e265-435a-a731-754cd19f9072" containerName="nova-scheduler-scheduler" probeResult="unknown" Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.221631 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.252915 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"aae8bd0a-01d7-497d-b7fb-26c495823af7","Type":"ContainerStarted","Data":"26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8"} Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.252971 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.277442 5200 generic.go:358] "Generic (PLEG): container finished" podID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerID="5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292" exitCode=0 Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.277490 5200 generic.go:358] "Generic (PLEG): container finished" podID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerID="8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2" exitCode=2 Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.277526 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerDied","Data":"5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292"} Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.277570 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerDied","Data":"8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2"} Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.279406 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb","Type":"ContainerStarted","Data":"c3b964ab1121c55e197e9b9c929c6fd48c8f726021af4a3f167064718031b820"} Nov 26 13:53:36 crc kubenswrapper[5200]: I1126 13:53:36.280071 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.280059142 podStartE2EDuration="2.280059142s" podCreationTimestamp="2025-11-26 13:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:36.272675724 +0000 UTC m=+1384.951877542" watchObservedRunningTime="2025-11-26 13:53:36.280059142 +0000 UTC m=+1384.959260960" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.017978 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a99fd8-1db7-48a0-ad37-5652b13c7e00" path="/var/lib/kubelet/pods/20a99fd8-1db7-48a0-ad37-5652b13c7e00/volumes" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.215293 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.294938 5200 generic.go:358] "Generic (PLEG): container finished" podID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerID="0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465" exitCode=0 Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.295104 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerDied","Data":"0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465"} Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.297453 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb","Type":"ContainerStarted","Data":"956d596d413d6bc508604849be962ec96f688d90bc9bd6e3f6b5953f97e5186e"} Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.299019 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/kube-state-metrics-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.303045 5200 generic.go:358] "Generic (PLEG): container finished" podID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerID="419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b" exitCode=0 Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.303315 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7198255-2c57-440a-aafe-ac9fceb5889a","Type":"ContainerDied","Data":"419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b"} Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.303350 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7198255-2c57-440a-aafe-ac9fceb5889a","Type":"ContainerDied","Data":"4f9388ade7708d7af423a2f70499c0cd4a4dbc01600851ef89a775ae7b9c521f"} Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.303366 5200 scope.go:117] "RemoveContainer" containerID="419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.303464 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.320649 5200 generic.go:358] "Generic (PLEG): container finished" podID="da7ec657-e265-435a-a731-754cd19f9072" containerID="56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6" exitCode=0 Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.320881 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da7ec657-e265-435a-a731-754cd19f9072","Type":"ContainerDied","Data":"56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6"} Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.331382 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.95299395 podStartE2EDuration="2.331355986s" podCreationTimestamp="2025-11-26 13:53:35 +0000 UTC" firstStartedPulling="2025-11-26 13:53:36.22385682 +0000 UTC m=+1384.903058638" lastFinishedPulling="2025-11-26 13:53:36.602218856 +0000 UTC m=+1385.281420674" observedRunningTime="2025-11-26 13:53:37.312654189 +0000 UTC m=+1385.991856007" watchObservedRunningTime="2025-11-26 13:53:37.331355986 +0000 UTC m=+1386.010557814" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.339810 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-combined-ca-bundle\") pod \"d7198255-2c57-440a-aafe-ac9fceb5889a\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.340017 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc2hg\" (UniqueName: \"kubernetes.io/projected/d7198255-2c57-440a-aafe-ac9fceb5889a-kube-api-access-pc2hg\") pod \"d7198255-2c57-440a-aafe-ac9fceb5889a\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.340120 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-config-data\") pod \"d7198255-2c57-440a-aafe-ac9fceb5889a\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.340279 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7198255-2c57-440a-aafe-ac9fceb5889a-logs\") pod \"d7198255-2c57-440a-aafe-ac9fceb5889a\" (UID: \"d7198255-2c57-440a-aafe-ac9fceb5889a\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.343367 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7198255-2c57-440a-aafe-ac9fceb5889a-logs" (OuterVolumeSpecName: "logs") pod "d7198255-2c57-440a-aafe-ac9fceb5889a" (UID: "d7198255-2c57-440a-aafe-ac9fceb5889a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.348713 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7198255-2c57-440a-aafe-ac9fceb5889a-kube-api-access-pc2hg" (OuterVolumeSpecName: "kube-api-access-pc2hg") pod "d7198255-2c57-440a-aafe-ac9fceb5889a" (UID: "d7198255-2c57-440a-aafe-ac9fceb5889a"). InnerVolumeSpecName "kube-api-access-pc2hg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.351259 5200 scope.go:117] "RemoveContainer" containerID="14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.371100 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.372060 5200 scope.go:117] "RemoveContainer" containerID="419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b" Nov 26 13:53:37 crc kubenswrapper[5200]: E1126 13:53:37.372410 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b\": container with ID starting with 419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b not found: ID does not exist" containerID="419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.372448 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b"} err="failed to get container status \"419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b\": rpc error: code = NotFound desc = could not find container \"419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b\": container with ID starting with 419a58d3ec5bb23b8eb559b67f6bc0113303e77e2c123abf140db97397923d2b not found: ID does not exist" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.372476 5200 scope.go:117] "RemoveContainer" containerID="14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399" Nov 26 13:53:37 crc kubenswrapper[5200]: E1126 13:53:37.373229 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399\": container with ID starting with 14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399 not found: ID does not exist" containerID="14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.373258 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399"} err="failed to get container status \"14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399\": rpc error: code = NotFound desc = could not find container \"14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399\": container with ID starting with 14302e16ac12416a41d753861504876d82d1efb96e67c2767d456a363db3b399 not found: ID does not exist" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.374330 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7198255-2c57-440a-aafe-ac9fceb5889a" (UID: "d7198255-2c57-440a-aafe-ac9fceb5889a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.381826 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-config-data" (OuterVolumeSpecName: "config-data") pod "d7198255-2c57-440a-aafe-ac9fceb5889a" (UID: "d7198255-2c57-440a-aafe-ac9fceb5889a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.442129 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmnbz\" (UniqueName: \"kubernetes.io/projected/da7ec657-e265-435a-a731-754cd19f9072-kube-api-access-rmnbz\") pod \"da7ec657-e265-435a-a731-754cd19f9072\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.442476 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-config-data\") pod \"da7ec657-e265-435a-a731-754cd19f9072\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.442612 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-combined-ca-bundle\") pod \"da7ec657-e265-435a-a731-754cd19f9072\" (UID: \"da7ec657-e265-435a-a731-754cd19f9072\") " Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.443194 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.443218 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pc2hg\" (UniqueName: \"kubernetes.io/projected/d7198255-2c57-440a-aafe-ac9fceb5889a-kube-api-access-pc2hg\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.443229 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7198255-2c57-440a-aafe-ac9fceb5889a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.443239 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7198255-2c57-440a-aafe-ac9fceb5889a-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.448522 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7ec657-e265-435a-a731-754cd19f9072-kube-api-access-rmnbz" (OuterVolumeSpecName: "kube-api-access-rmnbz") pod "da7ec657-e265-435a-a731-754cd19f9072" (UID: "da7ec657-e265-435a-a731-754cd19f9072"). InnerVolumeSpecName "kube-api-access-rmnbz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.493642 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-config-data" (OuterVolumeSpecName: "config-data") pod "da7ec657-e265-435a-a731-754cd19f9072" (UID: "da7ec657-e265-435a-a731-754cd19f9072"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.501869 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da7ec657-e265-435a-a731-754cd19f9072" (UID: "da7ec657-e265-435a-a731-754cd19f9072"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.545094 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmnbz\" (UniqueName: \"kubernetes.io/projected/da7ec657-e265-435a-a731-754cd19f9072-kube-api-access-rmnbz\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.545141 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.545153 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7ec657-e265-435a-a731-754cd19f9072-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.665939 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.676539 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.684010 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.685405 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-api" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.685479 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-api" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.685573 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da7ec657-e265-435a-a731-754cd19f9072" containerName="nova-scheduler-scheduler" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.685636 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7ec657-e265-435a-a731-754cd19f9072" containerName="nova-scheduler-scheduler" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.685790 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-log" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.685848 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-log" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.686089 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-log" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.686165 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="da7ec657-e265-435a-a731-754cd19f9072" containerName="nova-scheduler-scheduler" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.686260 5200 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" containerName="nova-api-api" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.696090 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.699899 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.717888 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.749259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.749358 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-config-data\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.749453 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbsb\" (UniqueName: \"kubernetes.io/projected/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-kube-api-access-9tbsb\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.749490 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-logs\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.851327 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-config-data\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.851432 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbsb\" (UniqueName: \"kubernetes.io/projected/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-kube-api-access-9tbsb\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.851468 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-logs\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.851549 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.851993 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-logs\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.857323 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.870207 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-config-data\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:37 crc kubenswrapper[5200]: I1126 13:53:37.871282 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbsb\" (UniqueName: \"kubernetes.io/projected/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-kube-api-access-9tbsb\") pod \"nova-api-0\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " pod="openstack/nova-api-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.018425 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.334024 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.334141 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da7ec657-e265-435a-a731-754cd19f9072","Type":"ContainerDied","Data":"06dd8730020af08e0c2a1952c274a87f13c21b8223c8d066451001ebdd12eba8"} Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.334850 5200 scope.go:117] "RemoveContainer" containerID="56c3a4910d1c067d24414886aceaa84790ac52c9a3004a8ce510774092fd40d6" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.403879 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.423261 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.436002 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.448436 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.451232 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.454939 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.496734 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.572466 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.573103 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-config-data\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.573301 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfz9s\" (UniqueName: \"kubernetes.io/projected/f4579235-0d34-4e19-bea7-7c0aac805971-kube-api-access-qfz9s\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.679840 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfz9s\" (UniqueName: \"kubernetes.io/projected/f4579235-0d34-4e19-bea7-7c0aac805971-kube-api-access-qfz9s\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.680105 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.680172 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-config-data\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.698084 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-config-data\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.698227 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc 
kubenswrapper[5200]: I1126 13:53:38.704422 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfz9s\" (UniqueName: \"kubernetes.io/projected/f4579235-0d34-4e19-bea7-7c0aac805971-kube-api-access-qfz9s\") pod \"nova-scheduler-0\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.776338 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.916678 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.981999 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7198255-2c57-440a-aafe-ac9fceb5889a" path="/var/lib/kubelet/pods/d7198255-2c57-440a-aafe-ac9fceb5889a/volumes" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.984770 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7ec657-e265-435a-a731-754cd19f9072" path="/var/lib/kubelet/pods/da7ec657-e265-435a-a731-754cd19f9072/volumes" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.986631 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-combined-ca-bundle\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.986829 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-scripts\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.987233 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-log-httpd\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.987278 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-sg-core-conf-yaml\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.987417 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrtkd\" (UniqueName: \"kubernetes.io/projected/43f0d583-75fa-4937-90ed-a4a7e564c081-kube-api-access-vrtkd\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.987769 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.987830 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-run-httpd\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.987925 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-config-data\") pod \"43f0d583-75fa-4937-90ed-a4a7e564c081\" (UID: \"43f0d583-75fa-4937-90ed-a4a7e564c081\") " Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.988121 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.988486 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.988501 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43f0d583-75fa-4937-90ed-a4a7e564c081-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:38 crc kubenswrapper[5200]: I1126 13:53:38.996412 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f0d583-75fa-4937-90ed-a4a7e564c081-kube-api-access-vrtkd" (OuterVolumeSpecName: "kube-api-access-vrtkd") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "kube-api-access-vrtkd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.002900 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-scripts" (OuterVolumeSpecName: "scripts") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.027230 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.090120 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.090780 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrtkd\" (UniqueName: \"kubernetes.io/projected/43f0d583-75fa-4937-90ed-a4a7e564c081-kube-api-access-vrtkd\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.090865 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.090928 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.090990 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.107797 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-config-data" (OuterVolumeSpecName: "config-data") pod "43f0d583-75fa-4937-90ed-a4a7e564c081" (UID: "43f0d583-75fa-4937-90ed-a4a7e564c081"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.193242 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f0d583-75fa-4937-90ed-a4a7e564c081-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.304440 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.359444 5200 generic.go:358] "Generic (PLEG): container finished" podID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerID="c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519" exitCode=0 Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.359993 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerDied","Data":"c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519"} Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.360030 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43f0d583-75fa-4937-90ed-a4a7e564c081","Type":"ContainerDied","Data":"076f8accd7c27d6460e80f1a86aaf901223b7fcfd4049d5773aa23ac95b8fd2a"} Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.360055 5200 scope.go:117] "RemoveContainer" containerID="5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.360218 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.378719 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4579235-0d34-4e19-bea7-7c0aac805971","Type":"ContainerStarted","Data":"c747f0af0159afdf65f2fbf0ec79edd2efed5687d24140675154d64089e7124b"} Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.386421 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a0d01d2-9f00-439e-bf14-ced1ad92cf39","Type":"ContainerStarted","Data":"3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79"} Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.386473 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a0d01d2-9f00-439e-bf14-ced1ad92cf39","Type":"ContainerStarted","Data":"05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677"} Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.386490 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a0d01d2-9f00-439e-bf14-ced1ad92cf39","Type":"ContainerStarted","Data":"dd76be84cf79c9725ec08e732f591bc6862afc4a067c0b74144ca8e87b9d069b"} Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.402078 5200 scope.go:117] "RemoveContainer" containerID="8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.420722 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.437565 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.449217 5200 scope.go:117] "RemoveContainer" containerID="c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.456129 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.457877 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="proxy-httpd" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.457904 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="proxy-httpd" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458142 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-central-agent" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458193 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-central-agent" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458222 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-notification-agent" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458232 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-notification-agent" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458303 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="sg-core" Nov 26 13:53:39 crc 
kubenswrapper[5200]: I1126 13:53:39.458312 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="sg-core" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458729 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="proxy-httpd" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458780 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-central-agent" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458806 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="sg-core" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.458859 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" containerName="ceilometer-notification-agent" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.459793 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.45977446 podStartE2EDuration="2.45977446s" podCreationTimestamp="2025-11-26 13:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:39.437596085 +0000 UTC m=+1388.116797913" watchObservedRunningTime="2025-11-26 13:53:39.45977446 +0000 UTC m=+1388.138976278" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.470703 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.475905 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.476280 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.476524 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ceilometer-internal-svc\"" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.476618 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.525039 5200 scope.go:117] "RemoveContainer" containerID="0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.599217 5200 scope.go:117] "RemoveContainer" containerID="5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292" Nov 26 13:53:39 crc kubenswrapper[5200]: E1126 13:53:39.600260 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292\": container with ID starting with 5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292 not found: ID does not exist" containerID="5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.600345 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292"} err="failed to get container status 
\"5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292\": rpc error: code = NotFound desc = could not find container \"5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292\": container with ID starting with 5692d15775a118c39f452719f21457ddb7b1b15eed0e46446bce0db81bf68292 not found: ID does not exist" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.600381 5200 scope.go:117] "RemoveContainer" containerID="8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2" Nov 26 13:53:39 crc kubenswrapper[5200]: E1126 13:53:39.600911 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2\": container with ID starting with 8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2 not found: ID does not exist" containerID="8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.600946 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2"} err="failed to get container status \"8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2\": rpc error: code = NotFound desc = could not find container \"8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2\": container with ID starting with 8a72af3a1dc4887c54a922963e7163f353d5d75afe688aa1786d2430da2868e2 not found: ID does not exist" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.600968 5200 scope.go:117] "RemoveContainer" containerID="c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519" Nov 26 13:53:39 crc kubenswrapper[5200]: E1126 13:53:39.601462 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519\": container with ID starting with c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519 not found: ID does not exist" containerID="c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.601491 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519"} err="failed to get container status \"c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519\": rpc error: code = NotFound desc = could not find container \"c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519\": container with ID starting with c4c21884b07180205ea479db5defdae2c9963b8526d2f44f23248b92880ac519 not found: ID does not exist" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.601504 5200 scope.go:117] "RemoveContainer" containerID="0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465" Nov 26 13:53:39 crc kubenswrapper[5200]: E1126 13:53:39.601817 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465\": container with ID starting with 0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465 not found: ID does not exist" containerID="0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.601843 5200 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465"} err="failed to get container status \"0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465\": rpc error: code = NotFound desc = could not find container \"0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465\": container with ID starting with 0201f4a9cc5b1f4affa47c47d33b28acbd69eb5239c93e8f2c6522685cfe4465 not found: ID does not exist" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.605573 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.605604 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-run-httpd\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.605904 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-scripts\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.606082 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.606110 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szkr8\" (UniqueName: \"kubernetes.io/projected/93f4c144-39f1-4496-9446-cd6e89d090eb-kube-api-access-szkr8\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.606202 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.606251 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-log-httpd\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.606319 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-config-data\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.708498 
5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-scripts\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.708922 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.708949 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szkr8\" (UniqueName: \"kubernetes.io/projected/93f4c144-39f1-4496-9446-cd6e89d090eb-kube-api-access-szkr8\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.708979 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.709006 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-log-httpd\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.709045 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-config-data\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.709157 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.709185 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-run-httpd\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.709552 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-log-httpd\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.709643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-run-httpd\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.714045 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.714104 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-scripts\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.714166 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-config-data\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.715332 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.716650 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.727477 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szkr8\" (UniqueName: \"kubernetes.io/projected/93f4c144-39f1-4496-9446-cd6e89d090eb-kube-api-access-szkr8\") pod \"ceilometer-0\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " pod="openstack/ceilometer-0" Nov 26 13:53:39 crc kubenswrapper[5200]: I1126 13:53:39.866322 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:53:40 crc kubenswrapper[5200]: I1126 13:53:40.403823 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4579235-0d34-4e19-bea7-7c0aac805971","Type":"ContainerStarted","Data":"5fea24c254804c79e5df7d26759ac48074dfc197acbf27ba5f6dfbbeff994912"} Nov 26 13:53:40 crc kubenswrapper[5200]: I1126 13:53:40.411100 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:53:40 crc kubenswrapper[5200]: I1126 13:53:40.426746 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.426726376 podStartE2EDuration="2.426726376s" podCreationTimestamp="2025-11-26 13:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:40.422787806 +0000 UTC m=+1389.101989634" watchObservedRunningTime="2025-11-26 13:53:40.426726376 +0000 UTC m=+1389.105928194" Nov 26 13:53:40 crc kubenswrapper[5200]: I1126 13:53:40.984323 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f0d583-75fa-4937-90ed-a4a7e564c081" path="/var/lib/kubelet/pods/43f0d583-75fa-4937-90ed-a4a7e564c081/volumes" Nov 26 13:53:41 crc kubenswrapper[5200]: I1126 13:53:41.419980 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerStarted","Data":"0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591"} Nov 26 13:53:41 crc kubenswrapper[5200]: I1126 13:53:41.420069 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerStarted","Data":"2414df72be1fdf92a5fc0b7f53a305c5cad18001ba5e55a2633ab9522d0c02c7"} Nov 26 13:53:42 crc kubenswrapper[5200]: I1126 13:53:42.360038 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 26 13:53:42 crc kubenswrapper[5200]: I1126 13:53:42.439855 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerStarted","Data":"ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc"} Nov 26 13:53:42 crc kubenswrapper[5200]: E1126 13:53:42.463131 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1786216 actualBytes=10240 Nov 26 13:53:43 crc kubenswrapper[5200]: I1126 13:53:43.453832 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerStarted","Data":"45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3"} Nov 26 13:53:43 crc kubenswrapper[5200]: I1126 13:53:43.777330 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 13:53:44 crc kubenswrapper[5200]: I1126 13:53:44.473937 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerStarted","Data":"631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9"} Nov 26 13:53:44 crc kubenswrapper[5200]: I1126 13:53:44.474069 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:53:44 crc 
kubenswrapper[5200]: I1126 13:53:44.506485 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.201626871 podStartE2EDuration="5.506460765s" podCreationTimestamp="2025-11-26 13:53:39 +0000 UTC" firstStartedPulling="2025-11-26 13:53:40.414622478 +0000 UTC m=+1389.093824296" lastFinishedPulling="2025-11-26 13:53:43.719456372 +0000 UTC m=+1392.398658190" observedRunningTime="2025-11-26 13:53:44.497585049 +0000 UTC m=+1393.176786947" watchObservedRunningTime="2025-11-26 13:53:44.506460765 +0000 UTC m=+1393.185662623" Nov 26 13:53:48 crc kubenswrapper[5200]: I1126 13:53:48.018996 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 13:53:48 crc kubenswrapper[5200]: I1126 13:53:48.019473 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 13:53:48 crc kubenswrapper[5200]: I1126 13:53:48.777668 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 13:53:48 crc kubenswrapper[5200]: I1126 13:53:48.833402 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 13:53:49 crc kubenswrapper[5200]: I1126 13:53:49.059953 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:53:49 crc kubenswrapper[5200]: I1126 13:53:49.060931 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:53:49 crc kubenswrapper[5200]: I1126 13:53:49.558069 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 26 13:53:49 crc kubenswrapper[5200]: I1126 13:53:49.562755 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 13:53:49 crc kubenswrapper[5200]: E1126 13:53:49.653175 5200 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/a8c2dbfd9073789e226d9969fe5df38fccd4d830072ed932632d6d944e3f9ecd/diff" to get inode usage: stat /var/lib/containers/storage/overlay/a8c2dbfd9073789e226d9969fe5df38fccd4d830072ed932632d6d944e3f9ecd/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_43f0d583-75fa-4937-90ed-a4a7e564c081/ceilometer-central-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_43f0d583-75fa-4937-90ed-a4a7e564c081/ceilometer-central-agent/0.log: no such file or directory Nov 26 13:53:50 crc kubenswrapper[5200]: E1126 13:53:50.386537 5200 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/96f93290797fe25b34150bb741df8a85e840770d35e79a92b678570458b40fff/diff" to get inode usage: stat /var/lib/containers/storage/overlay/96f93290797fe25b34150bb741df8a85e840770d35e79a92b678570458b40fff/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_ceilometer-0_43f0d583-75fa-4937-90ed-a4a7e564c081/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_43f0d583-75fa-4937-90ed-a4a7e564c081/ceilometer-notification-agent/0.log: no such file or directory Nov 26 13:53:55 crc kubenswrapper[5200]: E1126 13:53:55.377592 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e48d05_59f9_42a8_bc88_60ade1961efb.slice/crio-d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e48d05_59f9_42a8_bc88_60ade1961efb.slice/crio-conmon-d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f.scope\": RecentStats: unable to find data in memory cache]" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.509294 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.516850 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.585339 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0329d704-9dd2-403c-87b1-73794a949de0-logs\") pod \"0329d704-9dd2-403c-87b1-73794a949de0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.585478 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w29n\" (UniqueName: \"kubernetes.io/projected/0329d704-9dd2-403c-87b1-73794a949de0-kube-api-access-2w29n\") pod \"0329d704-9dd2-403c-87b1-73794a949de0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.585530 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-combined-ca-bundle\") pod \"0329d704-9dd2-403c-87b1-73794a949de0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.585560 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n6jj\" (UniqueName: \"kubernetes.io/projected/21e48d05-59f9-42a8-bc88-60ade1961efb-kube-api-access-2n6jj\") pod \"21e48d05-59f9-42a8-bc88-60ade1961efb\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.585722 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-combined-ca-bundle\") pod \"21e48d05-59f9-42a8-bc88-60ade1961efb\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.585767 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0329d704-9dd2-403c-87b1-73794a949de0-logs" (OuterVolumeSpecName: "logs") pod "0329d704-9dd2-403c-87b1-73794a949de0" (UID: "0329d704-9dd2-403c-87b1-73794a949de0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.586026 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-config-data\") pod \"21e48d05-59f9-42a8-bc88-60ade1961efb\" (UID: \"21e48d05-59f9-42a8-bc88-60ade1961efb\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.586077 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-config-data\") pod \"0329d704-9dd2-403c-87b1-73794a949de0\" (UID: \"0329d704-9dd2-403c-87b1-73794a949de0\") " Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.586495 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0329d704-9dd2-403c-87b1-73794a949de0-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.593226 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0329d704-9dd2-403c-87b1-73794a949de0-kube-api-access-2w29n" (OuterVolumeSpecName: "kube-api-access-2w29n") pod "0329d704-9dd2-403c-87b1-73794a949de0" (UID: "0329d704-9dd2-403c-87b1-73794a949de0"). InnerVolumeSpecName "kube-api-access-2w29n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.593946 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e48d05-59f9-42a8-bc88-60ade1961efb-kube-api-access-2n6jj" (OuterVolumeSpecName: "kube-api-access-2n6jj") pod "21e48d05-59f9-42a8-bc88-60ade1961efb" (UID: "21e48d05-59f9-42a8-bc88-60ade1961efb"). InnerVolumeSpecName "kube-api-access-2n6jj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.599504 5200 generic.go:358] "Generic (PLEG): container finished" podID="21e48d05-59f9-42a8-bc88-60ade1961efb" containerID="d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f" exitCode=137 Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.599696 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21e48d05-59f9-42a8-bc88-60ade1961efb","Type":"ContainerDied","Data":"d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f"} Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.599724 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.599761 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21e48d05-59f9-42a8-bc88-60ade1961efb","Type":"ContainerDied","Data":"e0aafcfd182b7b078de36ab8f11f86a3b865e31c577a176e9f478f4f8c6ea01a"} Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.599785 5200 scope.go:117] "RemoveContainer" containerID="d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.603808 5200 generic.go:358] "Generic (PLEG): container finished" podID="0329d704-9dd2-403c-87b1-73794a949de0" containerID="c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d" exitCode=137 Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.604294 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0329d704-9dd2-403c-87b1-73794a949de0","Type":"ContainerDied","Data":"c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d"} Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.604328 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0329d704-9dd2-403c-87b1-73794a949de0","Type":"ContainerDied","Data":"f6f468d5698438d3332199dcd5667b5c9e60ea0bb19bc5ebb9df2b1f71a95f7e"} Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.604389 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.620677 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21e48d05-59f9-42a8-bc88-60ade1961efb" (UID: "21e48d05-59f9-42a8-bc88-60ade1961efb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.622745 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0329d704-9dd2-403c-87b1-73794a949de0" (UID: "0329d704-9dd2-403c-87b1-73794a949de0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.625123 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-config-data" (OuterVolumeSpecName: "config-data") pod "21e48d05-59f9-42a8-bc88-60ade1961efb" (UID: "21e48d05-59f9-42a8-bc88-60ade1961efb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.632303 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-config-data" (OuterVolumeSpecName: "config-data") pod "0329d704-9dd2-403c-87b1-73794a949de0" (UID: "0329d704-9dd2-403c-87b1-73794a949de0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.688815 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.688863 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.688903 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w29n\" (UniqueName: \"kubernetes.io/projected/0329d704-9dd2-403c-87b1-73794a949de0-kube-api-access-2w29n\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.688916 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0329d704-9dd2-403c-87b1-73794a949de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.688926 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2n6jj\" (UniqueName: \"kubernetes.io/projected/21e48d05-59f9-42a8-bc88-60ade1961efb-kube-api-access-2n6jj\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.688934 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e48d05-59f9-42a8-bc88-60ade1961efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.697553 5200 scope.go:117] "RemoveContainer" containerID="d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f" Nov 26 13:53:55 crc kubenswrapper[5200]: E1126 13:53:55.698030 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f\": container with ID starting with d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f not found: ID does not exist" containerID="d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.698069 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f"} err="failed to get container status \"d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f\": rpc error: code = NotFound desc = could not find container \"d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f\": container with ID starting with d9d8e0bb07c38b7d0281ee18ff4fa9cb26bffb4869de74a9660d97208f56540f not found: ID does not exist" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.698091 5200 scope.go:117] "RemoveContainer" containerID="c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.727219 5200 scope.go:117] "RemoveContainer" containerID="e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.749943 5200 scope.go:117] "RemoveContainer" containerID="c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d" Nov 26 13:53:55 crc kubenswrapper[5200]: E1126 13:53:55.750375 5200 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d\": container with ID starting with c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d not found: ID does not exist" containerID="c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.750428 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d"} err="failed to get container status \"c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d\": rpc error: code = NotFound desc = could not find container \"c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d\": container with ID starting with c7418b7cbad40ce590c0a3242cda7539740ecac25e5b277e2269e3650345413d not found: ID does not exist" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.750458 5200 scope.go:117] "RemoveContainer" containerID="e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9" Nov 26 13:53:55 crc kubenswrapper[5200]: E1126 13:53:55.750898 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9\": container with ID starting with e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9 not found: ID does not exist" containerID="e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.750933 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9"} err="failed to get container status \"e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9\": rpc error: code = NotFound desc = could not find container \"e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9\": container with ID starting with e3973458ad7bb08149048c81cff611ac5da26e405b517505311f01438f90a7d9 not found: ID does not exist" Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.962503 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.982676 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:55 crc kubenswrapper[5200]: I1126 13:53:55.991808 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.008071 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.017470 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.019860 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21e48d05-59f9-42a8-bc88-60ade1961efb" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.019919 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e48d05-59f9-42a8-bc88-60ade1961efb" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.019992 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-metadata" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.020005 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-metadata" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.020054 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-log" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.020065 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-log" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.020380 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-log" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.020420 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="21e48d05-59f9-42a8-bc88-60ade1961efb" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.020445 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0329d704-9dd2-403c-87b1-73794a949de0" containerName="nova-metadata-metadata" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.037927 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.039764 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.043981 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-novncproxy-cell1-vencrypt\"" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.044057 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-novncproxy-cell1-public-svc\"" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.044371 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-novncproxy-config-data\"" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.053116 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.053197 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.053298 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.059745 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.068621 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-metadata-internal-svc\"" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199141 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199222 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdms\" (UniqueName: \"kubernetes.io/projected/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-kube-api-access-2jdms\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199285 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199320 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqv7n\" (UniqueName: \"kubernetes.io/projected/42c0a167-4edb-4321-b3c6-3790685f03e2-kube-api-access-xqv7n\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199336 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199391 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-config-data\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199425 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199609 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.199771 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c0a167-4edb-4321-b3c6-3790685f03e2-logs\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.301862 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.301971 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302052 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqv7n\" (UniqueName: \"kubernetes.io/projected/42c0a167-4edb-4321-b3c6-3790685f03e2-kube-api-access-xqv7n\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302087 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302183 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-config-data\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302249 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302303 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302340 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c0a167-4edb-4321-b3c6-3790685f03e2-logs\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.302442 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.303201 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c0a167-4edb-4321-b3c6-3790685f03e2-logs\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.303405 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdms\" (UniqueName: \"kubernetes.io/projected/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-kube-api-access-2jdms\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.308361 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.309607 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.311212 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.312606 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-config-data\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.312873 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.312969 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.313299 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.326078 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdms\" (UniqueName: \"kubernetes.io/projected/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-kube-api-access-2jdms\") pod \"nova-cell1-novncproxy-0\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.326881 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqv7n\" (UniqueName: \"kubernetes.io/projected/42c0a167-4edb-4321-b3c6-3790685f03e2-kube-api-access-xqv7n\") pod \"nova-metadata-0\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.369343 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.382877 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.987560 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0329d704-9dd2-403c-87b1-73794a949de0" path="/var/lib/kubelet/pods/0329d704-9dd2-403c-87b1-73794a949de0/volumes" Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.988873 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e48d05-59f9-42a8-bc88-60ade1961efb" path="/var/lib/kubelet/pods/21e48d05-59f9-42a8-bc88-60ade1961efb/volumes" Nov 26 13:53:56 crc kubenswrapper[5200]: W1126 13:53:56.988952 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c0a167_4edb_4321_b3c6_3790685f03e2.slice/crio-72a4d137f01fb460e34de9964af075f46a0d1d2c0a7426665e876216619911c0 WatchSource:0}: Error finding container 72a4d137f01fb460e34de9964af075f46a0d1d2c0a7426665e876216619911c0: Status 404 returned error can't find the container with id 72a4d137f01fb460e34de9964af075f46a0d1d2c0a7426665e876216619911c0 Nov 26 13:53:56 crc kubenswrapper[5200]: I1126 13:53:56.989466 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:53:57 crc kubenswrapper[5200]: W1126 13:53:57.073555 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58fe5504_07bd_4c4d_a2b4_3fbaa402649d.slice/crio-fc5e90d97911d7b404dc238e2ac23acd87020306506110bb00c6631f40913e68 WatchSource:0}: Error finding container fc5e90d97911d7b404dc238e2ac23acd87020306506110bb00c6631f40913e68: Status 404 returned error can't find the container with id fc5e90d97911d7b404dc238e2ac23acd87020306506110bb00c6631f40913e68 Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.079303 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.624991 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"58fe5504-07bd-4c4d-a2b4-3fbaa402649d","Type":"ContainerStarted","Data":"dae2b75b63a59901de07ca31786301f42b38647175de1bf352b0350a4ab170d9"} Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.625479 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"58fe5504-07bd-4c4d-a2b4-3fbaa402649d","Type":"ContainerStarted","Data":"fc5e90d97911d7b404dc238e2ac23acd87020306506110bb00c6631f40913e68"} Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.627352 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42c0a167-4edb-4321-b3c6-3790685f03e2","Type":"ContainerStarted","Data":"bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2"} Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.627392 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42c0a167-4edb-4321-b3c6-3790685f03e2","Type":"ContainerStarted","Data":"b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a"} Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.627404 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42c0a167-4edb-4321-b3c6-3790685f03e2","Type":"ContainerStarted","Data":"72a4d137f01fb460e34de9964af075f46a0d1d2c0a7426665e876216619911c0"} Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.681218 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.681194218 podStartE2EDuration="2.681194218s" podCreationTimestamp="2025-11-26 13:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:57.646597037 +0000 UTC m=+1406.325798865" watchObservedRunningTime="2025-11-26 13:53:57.681194218 +0000 UTC m=+1406.360396046" Nov 26 13:53:57 crc kubenswrapper[5200]: I1126 13:53:57.689260 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.689232333 podStartE2EDuration="2.689232333s" podCreationTimestamp="2025-11-26 13:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:53:57.670289341 +0000 UTC m=+1406.349491209" watchObservedRunningTime="2025-11-26 13:53:57.689232333 +0000 UTC m=+1406.368434151" Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.028035 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.029006 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.031309 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.033669 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.669727 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.675998 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 13:53:58 crc 
kubenswrapper[5200]: I1126 13:53:58.879041 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f7d474f5-w7lqk"] Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.894008 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f7d474f5-w7lqk"] Nov 26 13:53:58 crc kubenswrapper[5200]: I1126 13:53:58.894170 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.074241 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-sb\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.074287 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-swift-storage-0\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.074317 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtpxf\" (UniqueName: \"kubernetes.io/projected/4972e45b-4085-4bbb-a052-95698f889c49-kube-api-access-vtpxf\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.074355 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-svc\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.074832 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-config\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.075208 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-nb\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.176780 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtpxf\" (UniqueName: \"kubernetes.io/projected/4972e45b-4085-4bbb-a052-95698f889c49-kube-api-access-vtpxf\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.176855 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-svc\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.176933 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-config\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.177033 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-nb\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.177060 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-sb\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.177081 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-swift-storage-0\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.177980 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-swift-storage-0\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.178217 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-config\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.178320 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-nb\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.178559 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-svc\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.178810 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-sb\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: 
\"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.212099 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtpxf\" (UniqueName: \"kubernetes.io/projected/4972e45b-4085-4bbb-a052-95698f889c49-kube-api-access-vtpxf\") pod \"dnsmasq-dns-58f7d474f5-w7lqk\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.219501 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:53:59 crc kubenswrapper[5200]: I1126 13:53:59.709351 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f7d474f5-w7lqk"] Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.686999 5200 generic.go:358] "Generic (PLEG): container finished" podID="4972e45b-4085-4bbb-a052-95698f889c49" containerID="7c7d2aa287af6438dfeaebe64c1768b07474f9d5fc766f50b84d23a182c3422a" exitCode=0 Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.687103 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" event={"ID":"4972e45b-4085-4bbb-a052-95698f889c49","Type":"ContainerDied","Data":"7c7d2aa287af6438dfeaebe64c1768b07474f9d5fc766f50b84d23a182c3422a"} Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.687602 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" event={"ID":"4972e45b-4085-4bbb-a052-95698f889c49","Type":"ContainerStarted","Data":"083b5aebf9fd043c758835208e058a965da03cd1181cea157c7cd3eaf83bb10e"} Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.754409 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.754783 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-central-agent" containerID="cri-o://0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591" gracePeriod=30 Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.755799 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="proxy-httpd" containerID="cri-o://631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9" gracePeriod=30 Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.755930 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="sg-core" containerID="cri-o://45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3" gracePeriod=30 Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.755985 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-notification-agent" containerID="cri-o://ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc" gracePeriod=30 Nov 26 13:54:00 crc kubenswrapper[5200]: I1126 13:54:00.788780 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.191:3000/\": 
EOF" Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.242436 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.370104 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.383285 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.383426 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.700726 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerDied","Data":"631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9"} Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.700739 5200 generic.go:358] "Generic (PLEG): container finished" podID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerID="631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9" exitCode=0 Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.700798 5200 generic.go:358] "Generic (PLEG): container finished" podID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerID="45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3" exitCode=2 Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.700810 5200 generic.go:358] "Generic (PLEG): container finished" podID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerID="0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591" exitCode=0 Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.701025 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerDied","Data":"45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3"} Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.701103 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerDied","Data":"0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591"} Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.704000 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" event={"ID":"4972e45b-4085-4bbb-a052-95698f889c49","Type":"ContainerStarted","Data":"e57e9a52f8cde4ed409c4daef24f8b64435e90ed6915c4ccca0b63cbaa9821a8"} Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.704250 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.704251 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-log" containerID="cri-o://05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677" gracePeriod=30 Nov 26 13:54:01 crc kubenswrapper[5200]: I1126 13:54:01.704301 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-api" containerID="cri-o://3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79" gracePeriod=30 Nov 26 13:54:01 crc 
kubenswrapper[5200]: I1126 13:54:01.731971 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" podStartSLOduration=3.73195099 podStartE2EDuration="3.73195099s" podCreationTimestamp="2025-11-26 13:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:54:01.727437795 +0000 UTC m=+1410.406639623" watchObservedRunningTime="2025-11-26 13:54:01.73195099 +0000 UTC m=+1410.411152798" Nov 26 13:54:02 crc kubenswrapper[5200]: I1126 13:54:02.724650 5200 generic.go:358] "Generic (PLEG): container finished" podID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerID="05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677" exitCode=143 Nov 26 13:54:02 crc kubenswrapper[5200]: I1126 13:54:02.725701 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a0d01d2-9f00-439e-bf14-ced1ad92cf39","Type":"ContainerDied","Data":"05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677"} Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.154042 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.154369 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.338652 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363065 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-sg-core-conf-yaml\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363187 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szkr8\" (UniqueName: \"kubernetes.io/projected/93f4c144-39f1-4496-9446-cd6e89d090eb-kube-api-access-szkr8\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363215 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-ceilometer-tls-certs\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363247 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-combined-ca-bundle\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363278 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-log-httpd\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363327 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-run-httpd\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363469 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-config-data\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.363521 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-scripts\") pod \"93f4c144-39f1-4496-9446-cd6e89d090eb\" (UID: \"93f4c144-39f1-4496-9446-cd6e89d090eb\") " Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.365473 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.365804 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.373985 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f4c144-39f1-4496-9446-cd6e89d090eb-kube-api-access-szkr8" (OuterVolumeSpecName: "kube-api-access-szkr8") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "kube-api-access-szkr8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.375833 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-scripts" (OuterVolumeSpecName: "scripts") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.401000 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.436637 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.467531 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.467569 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.467583 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szkr8\" (UniqueName: \"kubernetes.io/projected/93f4c144-39f1-4496-9446-cd6e89d090eb-kube-api-access-szkr8\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.467595 5200 reconciler_common.go:299] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.467606 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.467615 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f4c144-39f1-4496-9446-cd6e89d090eb-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.484463 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.497415 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-config-data" (OuterVolumeSpecName: "config-data") pod "93f4c144-39f1-4496-9446-cd6e89d090eb" (UID: "93f4c144-39f1-4496-9446-cd6e89d090eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.569160 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.569492 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f4c144-39f1-4496-9446-cd6e89d090eb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.745657 5200 generic.go:358] "Generic (PLEG): container finished" podID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerID="ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc" exitCode=0 Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.745738 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerDied","Data":"ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc"} Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.745783 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f4c144-39f1-4496-9446-cd6e89d090eb","Type":"ContainerDied","Data":"2414df72be1fdf92a5fc0b7f53a305c5cad18001ba5e55a2633ab9522d0c02c7"} Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.745786 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.745805 5200 scope.go:117] "RemoveContainer" containerID="631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.783842 5200 scope.go:117] "RemoveContainer" containerID="45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.795777 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.814238 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.821425 5200 scope.go:117] "RemoveContainer" containerID="ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.823182 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824627 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-notification-agent" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824654 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-notification-agent" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824668 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="sg-core" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824674 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="sg-core" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824704 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="proxy-httpd" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824711 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="proxy-httpd" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824772 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-central-agent" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824777 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-central-agent" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824948 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="proxy-httpd" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824972 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-central-agent" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824981 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="sg-core" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.824991 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" containerName="ceilometer-notification-agent" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.833670 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.838479 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.878480 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.878568 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ceilometer-internal-svc\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.878719 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880073 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880138 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880160 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 
crc kubenswrapper[5200]: I1126 13:54:03.880182 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-run-httpd\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880261 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-scripts\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880292 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vk7\" (UniqueName: \"kubernetes.io/projected/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-kube-api-access-m6vk7\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880312 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-log-httpd\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.880395 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-config-data\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.904571 5200 scope.go:117] "RemoveContainer" containerID="0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.924273 5200 scope.go:117] "RemoveContainer" containerID="631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9" Nov 26 13:54:03 crc kubenswrapper[5200]: E1126 13:54:03.924791 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9\": container with ID starting with 631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9 not found: ID does not exist" containerID="631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.924835 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9"} err="failed to get container status \"631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9\": rpc error: code = NotFound desc = could not find container \"631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9\": container with ID starting with 631557ae9f29db0691482d15b2ed3d265bddea974abeca11738802fdf7a237b9 not found: ID does not exist" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.924863 5200 scope.go:117] "RemoveContainer" containerID="45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3" Nov 26 13:54:03 crc kubenswrapper[5200]: E1126 13:54:03.925152 5200 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3\": container with ID starting with 45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3 not found: ID does not exist" containerID="45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.925186 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3"} err="failed to get container status \"45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3\": rpc error: code = NotFound desc = could not find container \"45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3\": container with ID starting with 45e96ed601080ce783c68a10a5a5d0146ac16f9f8fb72da939cd0cf84f321dd3 not found: ID does not exist" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.925206 5200 scope.go:117] "RemoveContainer" containerID="ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc" Nov 26 13:54:03 crc kubenswrapper[5200]: E1126 13:54:03.925654 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc\": container with ID starting with ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc not found: ID does not exist" containerID="ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.925717 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc"} err="failed to get container status \"ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc\": rpc error: code = NotFound desc = could not find container \"ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc\": container with ID starting with ba061cfd86fb8ad6ab233dc9cb1ce6d58d100f7d264b40ed2a1b9703bc2d37dc not found: ID does not exist" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.925756 5200 scope.go:117] "RemoveContainer" containerID="0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591" Nov 26 13:54:03 crc kubenswrapper[5200]: E1126 13:54:03.926017 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591\": container with ID starting with 0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591 not found: ID does not exist" containerID="0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.926042 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591"} err="failed to get container status \"0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591\": rpc error: code = NotFound desc = could not find container \"0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591\": container with ID starting with 0846363a85c96e8f63e0c085857eb3630ecd238553f2350374b994891eb1b591 not found: ID does not exist" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.982123 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-scripts\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.982167 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vk7\" (UniqueName: \"kubernetes.io/projected/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-kube-api-access-m6vk7\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.982454 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-log-httpd\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.982735 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-config-data\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.982963 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-log-httpd\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.984021 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.984812 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.985084 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.985114 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-run-httpd\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.986406 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-run-httpd\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.988919 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-scripts\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.989242 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.989338 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-config-data\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.996419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.996962 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:03 crc kubenswrapper[5200]: I1126 13:54:03.999121 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vk7\" (UniqueName: \"kubernetes.io/projected/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-kube-api-access-m6vk7\") pod \"ceilometer-0\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " pod="openstack/ceilometer-0" Nov 26 13:54:04 crc kubenswrapper[5200]: I1126 13:54:04.197793 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:54:04 crc kubenswrapper[5200]: I1126 13:54:04.745553 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:54:04 crc kubenswrapper[5200]: I1126 13:54:04.765302 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerStarted","Data":"ca264c34083f559cbdf6d65c618fba39485e4e5b470cbbf0162a97b8278f710d"} Nov 26 13:54:04 crc kubenswrapper[5200]: I1126 13:54:04.992700 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f4c144-39f1-4496-9446-cd6e89d090eb" path="/var/lib/kubelet/pods/93f4c144-39f1-4496-9446-cd6e89d090eb/volumes" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.427257 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.518753 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-combined-ca-bundle\") pod \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.518973 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tbsb\" (UniqueName: \"kubernetes.io/projected/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-kube-api-access-9tbsb\") pod \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.519087 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-config-data\") pod \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.519121 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-logs\") pod \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\" (UID: \"0a0d01d2-9f00-439e-bf14-ced1ad92cf39\") " Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.521030 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-logs" (OuterVolumeSpecName: "logs") pod "0a0d01d2-9f00-439e-bf14-ced1ad92cf39" (UID: "0a0d01d2-9f00-439e-bf14-ced1ad92cf39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.527889 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-kube-api-access-9tbsb" (OuterVolumeSpecName: "kube-api-access-9tbsb") pod "0a0d01d2-9f00-439e-bf14-ced1ad92cf39" (UID: "0a0d01d2-9f00-439e-bf14-ced1ad92cf39"). InnerVolumeSpecName "kube-api-access-9tbsb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.549060 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a0d01d2-9f00-439e-bf14-ced1ad92cf39" (UID: "0a0d01d2-9f00-439e-bf14-ced1ad92cf39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.559542 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-config-data" (OuterVolumeSpecName: "config-data") pod "0a0d01d2-9f00-439e-bf14-ced1ad92cf39" (UID: "0a0d01d2-9f00-439e-bf14-ced1ad92cf39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.624548 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.624583 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9tbsb\" (UniqueName: \"kubernetes.io/projected/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-kube-api-access-9tbsb\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.624595 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.624605 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a0d01d2-9f00-439e-bf14-ced1ad92cf39-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.776927 5200 generic.go:358] "Generic (PLEG): container finished" podID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerID="3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79" exitCode=0 Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.777008 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.777023 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a0d01d2-9f00-439e-bf14-ced1ad92cf39","Type":"ContainerDied","Data":"3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79"} Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.777074 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a0d01d2-9f00-439e-bf14-ced1ad92cf39","Type":"ContainerDied","Data":"dd76be84cf79c9725ec08e732f591bc6862afc4a067c0b74144ca8e87b9d069b"} Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.777094 5200 scope.go:117] "RemoveContainer" containerID="3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.780118 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerStarted","Data":"b908cb3b6ebf4aeb47d606fa2e54c25261e51e4dbd764b3dd457699d4b48474a"} Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.810571 5200 scope.go:117] "RemoveContainer" containerID="05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.819269 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.868499 5200 scope.go:117] "RemoveContainer" containerID="3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79" Nov 26 13:54:05 crc kubenswrapper[5200]: E1126 13:54:05.869118 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79\": container with ID starting with 3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79 not found: ID does not exist" 
containerID="3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.869150 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79"} err="failed to get container status \"3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79\": rpc error: code = NotFound desc = could not find container \"3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79\": container with ID starting with 3209fd6895fecec36fd66dc551eaa41fc065171e46842551340ac1dca19c3e79 not found: ID does not exist" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.869185 5200 scope.go:117] "RemoveContainer" containerID="05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677" Nov 26 13:54:05 crc kubenswrapper[5200]: E1126 13:54:05.873001 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677\": container with ID starting with 05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677 not found: ID does not exist" containerID="05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.873046 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677"} err="failed to get container status \"05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677\": rpc error: code = NotFound desc = could not find container \"05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677\": container with ID starting with 05c164311f7b847977196309f9bbdbcd617ac47cf115a34f33b731a4660da677 not found: ID does not exist" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.878373 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.908763 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.910000 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-api" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.910020 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-api" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.910035 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-log" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.910041 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-log" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.910263 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-api" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.910292 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" containerName="nova-api-log" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.916440 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.918668 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.920016 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.920160 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-internal-svc\"" Nov 26 13:54:05 crc kubenswrapper[5200]: I1126 13:54:05.920315 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-public-svc\"" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.039929 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.040038 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.040170 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.040222 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-logs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.040418 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-config-data\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.040444 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8kb\" (UniqueName: \"kubernetes.io/projected/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-kube-api-access-st8kb\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.142248 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.142642 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.142662 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-logs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.142714 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-config-data\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.142730 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st8kb\" (UniqueName: \"kubernetes.io/projected/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-kube-api-access-st8kb\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.142840 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.144284 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-logs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.148428 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.148508 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-config-data\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.149310 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.153633 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.170539 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8kb\" (UniqueName: \"kubernetes.io/projected/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-kube-api-access-st8kb\") pod 
\"nova-api-0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.244954 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.371838 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.383905 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.383994 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.393280 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:54:06 crc kubenswrapper[5200]: W1126 13:54:06.542848 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6ad681_8b7d_459f_ab2a_1130ab8c96b0.slice/crio-49afc7156975318249de4eafb6bcbc630900ccf87a73dff7eb772604a536add8 WatchSource:0}: Error finding container 49afc7156975318249de4eafb6bcbc630900ccf87a73dff7eb772604a536add8: Status 404 returned error can't find the container with id 49afc7156975318249de4eafb6bcbc630900ccf87a73dff7eb772604a536add8 Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.547657 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.794571 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0","Type":"ContainerStarted","Data":"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4"} Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.795578 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0","Type":"ContainerStarted","Data":"49afc7156975318249de4eafb6bcbc630900ccf87a73dff7eb772604a536add8"} Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.807026 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerStarted","Data":"52bb01fb6b98b6d753a0b12d991a396878b553610972bbf7719db7a828b9c734"} Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.842054 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:54:06 crc kubenswrapper[5200]: I1126 13:54:06.995443 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0d01d2-9f00-439e-bf14-ced1ad92cf39" path="/var/lib/kubelet/pods/0a0d01d2-9f00-439e-bf14-ced1ad92cf39/volumes" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.032973 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-58ghr"] Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.038485 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.040963 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-manage-scripts\"" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.042326 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-manage-config-data\"" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.068571 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-58ghr"] Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.182982 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkcx\" (UniqueName: \"kubernetes.io/projected/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-kube-api-access-hwkcx\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.183601 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-scripts\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.183645 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.183840 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-config-data\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.286147 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkcx\" (UniqueName: \"kubernetes.io/projected/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-kube-api-access-hwkcx\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.286241 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-scripts\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.286274 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.286310 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-config-data\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.291513 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.292866 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-scripts\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.292882 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-config-data\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.326215 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkcx\" (UniqueName: \"kubernetes.io/projected/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-kube-api-access-hwkcx\") pod \"nova-cell1-cell-mapping-58ghr\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.366170 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.392946 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.394428 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.718137 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-58ghr"] Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.726838 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:54:07 crc kubenswrapper[5200]: W1126 13:54:07.731321 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bbe3ea8_34d8_4d29_a552_f7408d9a66d5.slice/crio-b5b194a72082703ba3043dca58b4d7c942cd697579ff87ecdc09aafd304ede8f WatchSource:0}: Error finding container b5b194a72082703ba3043dca58b4d7c942cd697579ff87ecdc09aafd304ede8f: Status 404 returned error can't find the container with id b5b194a72082703ba3043dca58b4d7c942cd697579ff87ecdc09aafd304ede8f Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.807957 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59f7db47d5-mbc9q"] Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.808299 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerName="dnsmasq-dns" containerID="cri-o://5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451" gracePeriod=10 Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.827133 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerStarted","Data":"c12f4bdddf37a1ef5e860426fe948bdb51ce3f0edf70735ec8c9ab110cb59d7e"} Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.830232 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0","Type":"ContainerStarted","Data":"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd"} Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.840557 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-58ghr" event={"ID":"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5","Type":"ContainerStarted","Data":"b5b194a72082703ba3043dca58b4d7c942cd697579ff87ecdc09aafd304ede8f"} Nov 26 13:54:07 crc kubenswrapper[5200]: I1126 13:54:07.873245 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.873218101 podStartE2EDuration="2.873218101s" podCreationTimestamp="2025-11-26 13:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 13:54:07.865828492 +0000 UTC m=+1416.545030310" watchObservedRunningTime="2025-11-26 13:54:07.873218101 +0000 UTC m=+1416.552419909" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.375189 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.515582 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8ldc\" (UniqueName: \"kubernetes.io/projected/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-kube-api-access-f8ldc\") pod \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.516019 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-nb\") pod \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.516044 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-sb\") pod \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.516086 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-svc\") pod \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.516119 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-config\") pod \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.516189 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-swift-storage-0\") pod \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\" (UID: \"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6\") " Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.546537 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-kube-api-access-f8ldc" (OuterVolumeSpecName: "kube-api-access-f8ldc") pod "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" (UID: "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6"). InnerVolumeSpecName "kube-api-access-f8ldc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.598923 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-config" (OuterVolumeSpecName: "config") pod "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" (UID: "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.603247 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" (UID: "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.613407 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" (UID: "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.613503 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" (UID: "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.620006 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8ldc\" (UniqueName: \"kubernetes.io/projected/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-kube-api-access-f8ldc\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.620140 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.620220 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.620280 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.620341 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.658509 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" (UID: "0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.722610 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.852307 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-58ghr" event={"ID":"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5","Type":"ContainerStarted","Data":"8af815e72ea77b3056855c0473376215523b1abe17dce32a1fe4a0577d3966db"} Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.856488 5200 generic.go:358] "Generic (PLEG): container finished" podID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerID="5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451" exitCode=0 Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.857320 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.859832 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" event={"ID":"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6","Type":"ContainerDied","Data":"5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451"} Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.859902 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f7db47d5-mbc9q" event={"ID":"0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6","Type":"ContainerDied","Data":"cec617543077dcadca6beb0fc285732fa9f9b5ce5ffd1192757be46d5af7bbc4"} Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.859927 5200 scope.go:117] "RemoveContainer" containerID="5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.880528 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-58ghr" podStartSLOduration=2.8805047630000002 podStartE2EDuration="2.880504763s" podCreationTimestamp="2025-11-26 13:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:54:08.870887968 +0000 UTC m=+1417.550089796" watchObservedRunningTime="2025-11-26 13:54:08.880504763 +0000 UTC m=+1417.559706581" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.933798 5200 scope.go:117] "RemoveContainer" containerID="7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.945901 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59f7db47d5-mbc9q"] Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.963875 5200 scope.go:117] "RemoveContainer" containerID="5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.964551 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59f7db47d5-mbc9q"] Nov 26 13:54:08 crc kubenswrapper[5200]: E1126 13:54:08.964569 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451\": container with ID starting with 5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451 not found: ID does not exist" 
containerID="5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.964663 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451"} err="failed to get container status \"5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451\": rpc error: code = NotFound desc = could not find container \"5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451\": container with ID starting with 5bb1e0c9b05bec2760a51e469f2d673bdb0af683e40c02fee2cbf1f82255c451 not found: ID does not exist" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.964788 5200 scope.go:117] "RemoveContainer" containerID="7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b" Nov 26 13:54:08 crc kubenswrapper[5200]: E1126 13:54:08.965416 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b\": container with ID starting with 7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b not found: ID does not exist" containerID="7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.965453 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b"} err="failed to get container status \"7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b\": rpc error: code = NotFound desc = could not find container \"7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b\": container with ID starting with 7252b0ba48f1cdb02854f32ea16426c133caf3a0c190bbb34fb0cc4dd136a45b not found: ID does not exist" Nov 26 13:54:08 crc kubenswrapper[5200]: I1126 13:54:08.980758 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" path="/var/lib/kubelet/pods/0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6/volumes" Nov 26 13:54:09 crc kubenswrapper[5200]: I1126 13:54:09.899281 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerStarted","Data":"b9326640d2dd0dc2cb465cceabfeb8666c56ac7ec888f7a972c9d1baf978168e"} Nov 26 13:54:09 crc kubenswrapper[5200]: I1126 13:54:09.899916 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 13:54:09 crc kubenswrapper[5200]: I1126 13:54:09.935175 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.739334037 podStartE2EDuration="6.935155652s" podCreationTimestamp="2025-11-26 13:54:03 +0000 UTC" firstStartedPulling="2025-11-26 13:54:04.737901933 +0000 UTC m=+1413.417103751" lastFinishedPulling="2025-11-26 13:54:08.933723548 +0000 UTC m=+1417.612925366" observedRunningTime="2025-11-26 13:54:09.926336157 +0000 UTC m=+1418.605537995" watchObservedRunningTime="2025-11-26 13:54:09.935155652 +0000 UTC m=+1418.614357470" Nov 26 13:54:13 crc kubenswrapper[5200]: I1126 13:54:13.960005 5200 generic.go:358] "Generic (PLEG): container finished" podID="0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" containerID="8af815e72ea77b3056855c0473376215523b1abe17dce32a1fe4a0577d3966db" exitCode=0 Nov 26 13:54:13 crc kubenswrapper[5200]: I1126 
13:54:13.960362 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-58ghr" event={"ID":"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5","Type":"ContainerDied","Data":"8af815e72ea77b3056855c0473376215523b1abe17dce32a1fe4a0577d3966db"} Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.459201 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.588819 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-config-data\") pod \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.588915 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-scripts\") pod \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.588968 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkcx\" (UniqueName: \"kubernetes.io/projected/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-kube-api-access-hwkcx\") pod \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.589058 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-combined-ca-bundle\") pod \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\" (UID: \"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5\") " Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.597124 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-kube-api-access-hwkcx" (OuterVolumeSpecName: "kube-api-access-hwkcx") pod "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" (UID: "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5"). InnerVolumeSpecName "kube-api-access-hwkcx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.598138 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-scripts" (OuterVolumeSpecName: "scripts") pod "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" (UID: "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.627313 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-config-data" (OuterVolumeSpecName: "config-data") pod "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" (UID: "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.629305 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" (UID: "0bbe3ea8-34d8-4d29-a552-f7408d9a66d5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.691105 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.691156 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.691165 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwkcx\" (UniqueName: \"kubernetes.io/projected/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-kube-api-access-hwkcx\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.691176 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.995215 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-58ghr" Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.995217 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-58ghr" event={"ID":"0bbe3ea8-34d8-4d29-a552-f7408d9a66d5","Type":"ContainerDied","Data":"b5b194a72082703ba3043dca58b4d7c942cd697579ff87ecdc09aafd304ede8f"} Nov 26 13:54:15 crc kubenswrapper[5200]: I1126 13:54:15.995282 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b194a72082703ba3043dca58b4d7c942cd697579ff87ecdc09aafd304ede8f" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.218706 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.220156 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f4579235-0d34-4e19-bea7-7c0aac805971" containerName="nova-scheduler-scheduler" containerID="cri-o://5fea24c254804c79e5df7d26759ac48074dfc197acbf27ba5f6dfbbeff994912" gracePeriod=30 Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.240795 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.241159 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-log" containerID="cri-o://bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4" gracePeriod=30 Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.241293 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-api" containerID="cri-o://55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd" gracePeriod=30 Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.256103 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.258972 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-log" containerID="cri-o://b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a" gracePeriod=30 Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.259126 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-metadata" containerID="cri-o://bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2" gracePeriod=30 Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.871727 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.916169 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-public-tls-certs\") pod \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.916253 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-internal-tls-certs\") pod \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.916373 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-combined-ca-bundle\") pod \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.916404 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8kb\" (UniqueName: \"kubernetes.io/projected/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-kube-api-access-st8kb\") pod \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.916522 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-config-data\") pod \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.916737 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-logs\") pod \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\" (UID: \"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0\") " Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.917324 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-logs" (OuterVolumeSpecName: "logs") pod "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" (UID: "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.944405 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-kube-api-access-st8kb" (OuterVolumeSpecName: "kube-api-access-st8kb") pod "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" (UID: "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0"). InnerVolumeSpecName "kube-api-access-st8kb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.957043 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-config-data" (OuterVolumeSpecName: "config-data") pod "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" (UID: "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.960795 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" (UID: "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.988453 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" (UID: "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:16 crc kubenswrapper[5200]: I1126 13:54:16.990431 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" (UID: "2b6ad681-8b7d-459f-ab2a-1130ab8c96b0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008128 5200 generic.go:358] "Generic (PLEG): container finished" podID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerID="55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd" exitCode=0 Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008166 5200 generic.go:358] "Generic (PLEG): container finished" podID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerID="bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4" exitCode=143 Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008207 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0","Type":"ContainerDied","Data":"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd"} Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008273 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0","Type":"ContainerDied","Data":"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4"} Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008286 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b6ad681-8b7d-459f-ab2a-1130ab8c96b0","Type":"ContainerDied","Data":"49afc7156975318249de4eafb6bcbc630900ccf87a73dff7eb772604a536add8"} Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008288 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.008307 5200 scope.go:117] "RemoveContainer" containerID="55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.025507 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.025942 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.026059 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.026269 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.026363 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-st8kb\" (UniqueName: \"kubernetes.io/projected/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-kube-api-access-st8kb\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.026509 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.028370 5200 generic.go:358] "Generic (PLEG): container finished" 
podID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerID="b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a" exitCode=143 Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.028987 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42c0a167-4edb-4321-b3c6-3790685f03e2","Type":"ContainerDied","Data":"b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a"} Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.055701 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.067677 5200 scope.go:117] "RemoveContainer" containerID="bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.072929 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.105576 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106801 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerName="dnsmasq-dns" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106820 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerName="dnsmasq-dns" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106853 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-log" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106859 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-log" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106869 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerName="init" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106874 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerName="init" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106881 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-api" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106886 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-api" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106896 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" containerName="nova-manage" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.106910 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" containerName="nova-manage" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.107062 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" containerName="nova-manage" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.107074 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cf581c0-f75b-4563-bcbf-8f0e56ae6ed6" containerName="dnsmasq-dns" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.107082 5200 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-log" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.107096 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" containerName="nova-api-api" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.117999 5200 scope.go:117] "RemoveContainer" containerID="55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd" Nov 26 13:54:17 crc kubenswrapper[5200]: E1126 13:54:17.119207 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd\": container with ID starting with 55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd not found: ID does not exist" containerID="55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.119276 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd"} err="failed to get container status \"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd\": rpc error: code = NotFound desc = could not find container \"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd\": container with ID starting with 55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd not found: ID does not exist" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.119318 5200 scope.go:117] "RemoveContainer" containerID="bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4" Nov 26 13:54:17 crc kubenswrapper[5200]: E1126 13:54:17.121116 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4\": container with ID starting with bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4 not found: ID does not exist" containerID="bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.121174 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4"} err="failed to get container status \"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4\": rpc error: code = NotFound desc = could not find container \"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4\": container with ID starting with bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4 not found: ID does not exist" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.121206 5200 scope.go:117] "RemoveContainer" containerID="55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.122950 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd"} err="failed to get container status \"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd\": rpc error: code = NotFound desc = could not find container \"55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd\": container with ID starting with 55cc6126a6a0008bef5a2f22fb6d37a9c190099d03c5a66f58d979f4fabb23bd not found: ID does not exist" 
Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.123020 5200 scope.go:117] "RemoveContainer" containerID="bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.123954 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4"} err="failed to get container status \"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4\": rpc error: code = NotFound desc = could not find container \"bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4\": container with ID starting with bf4f7adbab65e906932574c4734035f1239bbcf818b30bb590002115fd7356a4 not found: ID does not exist" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.132026 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.132214 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.136346 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.137098 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-internal-svc\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.137305 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-public-svc\"" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.236897 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.236972 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kwxc\" (UniqueName: \"kubernetes.io/projected/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-kube-api-access-2kwxc\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.237143 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.237279 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-logs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.237859 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 
13:54:17.238002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-config-data\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.340472 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.340563 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-config-data\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.340616 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.340677 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kwxc\" (UniqueName: \"kubernetes.io/projected/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-kube-api-access-2kwxc\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.340728 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.340764 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-logs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.341287 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-logs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.346660 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.347107 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.348146 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-config-data\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.348818 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.372028 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kwxc\" (UniqueName: \"kubernetes.io/projected/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-kube-api-access-2kwxc\") pod \"nova-api-0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.460273 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:54:17 crc kubenswrapper[5200]: I1126 13:54:17.934330 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:54:17 crc kubenswrapper[5200]: W1126 13:54:17.942196 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c7ebc3_a1ef_4244_8fc8_a46aca7a1ee0.slice/crio-d7feaa0e0a943cd5ff1049b449daef3bbd78f5a0c53dcbd0016828ef49824f9b WatchSource:0}: Error finding container d7feaa0e0a943cd5ff1049b449daef3bbd78f5a0c53dcbd0016828ef49824f9b: Status 404 returned error can't find the container with id d7feaa0e0a943cd5ff1049b449daef3bbd78f5a0c53dcbd0016828ef49824f9b Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.049600 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0","Type":"ContainerStarted","Data":"d7feaa0e0a943cd5ff1049b449daef3bbd78f5a0c53dcbd0016828ef49824f9b"} Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.056258 5200 generic.go:358] "Generic (PLEG): container finished" podID="f4579235-0d34-4e19-bea7-7c0aac805971" containerID="5fea24c254804c79e5df7d26759ac48074dfc197acbf27ba5f6dfbbeff994912" exitCode=0 Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.056339 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4579235-0d34-4e19-bea7-7c0aac805971","Type":"ContainerDied","Data":"5fea24c254804c79e5df7d26759ac48074dfc197acbf27ba5f6dfbbeff994912"} Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.207857 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.360441 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-combined-ca-bundle\") pod \"f4579235-0d34-4e19-bea7-7c0aac805971\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.360635 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-config-data\") pod \"f4579235-0d34-4e19-bea7-7c0aac805971\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.360707 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfz9s\" (UniqueName: \"kubernetes.io/projected/f4579235-0d34-4e19-bea7-7c0aac805971-kube-api-access-qfz9s\") pod \"f4579235-0d34-4e19-bea7-7c0aac805971\" (UID: \"f4579235-0d34-4e19-bea7-7c0aac805971\") " Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.367056 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4579235-0d34-4e19-bea7-7c0aac805971-kube-api-access-qfz9s" (OuterVolumeSpecName: "kube-api-access-qfz9s") pod "f4579235-0d34-4e19-bea7-7c0aac805971" (UID: "f4579235-0d34-4e19-bea7-7c0aac805971"). InnerVolumeSpecName "kube-api-access-qfz9s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.400254 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-config-data" (OuterVolumeSpecName: "config-data") pod "f4579235-0d34-4e19-bea7-7c0aac805971" (UID: "f4579235-0d34-4e19-bea7-7c0aac805971"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.409052 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4579235-0d34-4e19-bea7-7c0aac805971" (UID: "f4579235-0d34-4e19-bea7-7c0aac805971"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.463407 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.463457 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4579235-0d34-4e19-bea7-7c0aac805971-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.463470 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfz9s\" (UniqueName: \"kubernetes.io/projected/f4579235-0d34-4e19-bea7-7c0aac805971-kube-api-access-qfz9s\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:18 crc kubenswrapper[5200]: I1126 13:54:18.982802 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6ad681-8b7d-459f-ab2a-1130ab8c96b0" path="/var/lib/kubelet/pods/2b6ad681-8b7d-459f-ab2a-1130ab8c96b0/volumes" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.069378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0","Type":"ContainerStarted","Data":"78be597c70fe60222ad7f16fc5b17c8bb5fb9f375d97a8edf8fce01c8f2a8af4"} Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.069464 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0","Type":"ContainerStarted","Data":"913da7fa54a8dadf2cb1d0f96e7c1a4c10b128d92a72d6916595429a588b3727"} Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.072579 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.072627 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f4579235-0d34-4e19-bea7-7c0aac805971","Type":"ContainerDied","Data":"c747f0af0159afdf65f2fbf0ec79edd2efed5687d24140675154d64089e7124b"} Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.072678 5200 scope.go:117] "RemoveContainer" containerID="5fea24c254804c79e5df7d26759ac48074dfc197acbf27ba5f6dfbbeff994912" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.095094 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.095072972 podStartE2EDuration="2.095072972s" podCreationTimestamp="2025-11-26 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:54:19.094316512 +0000 UTC m=+1427.773518400" watchObservedRunningTime="2025-11-26 13:54:19.095072972 +0000 UTC m=+1427.774274790" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.124047 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.150638 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.161370 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.162790 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4579235-0d34-4e19-bea7-7c0aac805971" containerName="nova-scheduler-scheduler" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.162812 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4579235-0d34-4e19-bea7-7c0aac805971" containerName="nova-scheduler-scheduler" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.163006 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4579235-0d34-4e19-bea7-7c0aac805971" containerName="nova-scheduler-scheduler" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.175613 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.175807 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.179636 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.283751 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhvm\" (UniqueName: \"kubernetes.io/projected/adab4e53-c668-44cf-be5d-5802a77124aa-kube-api-access-xkhvm\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.284197 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-config-data\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.284324 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.386093 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhvm\" (UniqueName: \"kubernetes.io/projected/adab4e53-c668-44cf-be5d-5802a77124aa-kube-api-access-xkhvm\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.386604 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-config-data\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.387374 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.394134 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.395075 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-config-data\") pod \"nova-scheduler-0\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.412895 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhvm\" (UniqueName: \"kubernetes.io/projected/adab4e53-c668-44cf-be5d-5802a77124aa-kube-api-access-xkhvm\") pod \"nova-scheduler-0\" (UID: 
\"adab4e53-c668-44cf-be5d-5802a77124aa\") " pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.545587 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:54:19 crc kubenswrapper[5200]: I1126 13:54:19.826560 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.000405 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-config-data\") pod \"42c0a167-4edb-4321-b3c6-3790685f03e2\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.000490 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-combined-ca-bundle\") pod \"42c0a167-4edb-4321-b3c6-3790685f03e2\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.000590 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c0a167-4edb-4321-b3c6-3790685f03e2-logs\") pod \"42c0a167-4edb-4321-b3c6-3790685f03e2\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.000752 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqv7n\" (UniqueName: \"kubernetes.io/projected/42c0a167-4edb-4321-b3c6-3790685f03e2-kube-api-access-xqv7n\") pod \"42c0a167-4edb-4321-b3c6-3790685f03e2\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.000885 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-nova-metadata-tls-certs\") pod \"42c0a167-4edb-4321-b3c6-3790685f03e2\" (UID: \"42c0a167-4edb-4321-b3c6-3790685f03e2\") " Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.003658 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c0a167-4edb-4321-b3c6-3790685f03e2-logs" (OuterVolumeSpecName: "logs") pod "42c0a167-4edb-4321-b3c6-3790685f03e2" (UID: "42c0a167-4edb-4321-b3c6-3790685f03e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.006519 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c0a167-4edb-4321-b3c6-3790685f03e2-kube-api-access-xqv7n" (OuterVolumeSpecName: "kube-api-access-xqv7n") pod "42c0a167-4edb-4321-b3c6-3790685f03e2" (UID: "42c0a167-4edb-4321-b3c6-3790685f03e2"). InnerVolumeSpecName "kube-api-access-xqv7n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.039599 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.042938 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-config-data" (OuterVolumeSpecName: "config-data") pod "42c0a167-4edb-4321-b3c6-3790685f03e2" (UID: "42c0a167-4edb-4321-b3c6-3790685f03e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:20 crc kubenswrapper[5200]: W1126 13:54:20.044475 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadab4e53_c668_44cf_be5d_5802a77124aa.slice/crio-766214c0e0a3d2c44051458267145692b94ccfa19f1796075fe1fd732a8f4575 WatchSource:0}: Error finding container 766214c0e0a3d2c44051458267145692b94ccfa19f1796075fe1fd732a8f4575: Status 404 returned error can't find the container with id 766214c0e0a3d2c44051458267145692b94ccfa19f1796075fe1fd732a8f4575 Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.050460 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42c0a167-4edb-4321-b3c6-3790685f03e2" (UID: "42c0a167-4edb-4321-b3c6-3790685f03e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.086046 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "42c0a167-4edb-4321-b3c6-3790685f03e2" (UID: "42c0a167-4edb-4321-b3c6-3790685f03e2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.088864 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adab4e53-c668-44cf-be5d-5802a77124aa","Type":"ContainerStarted","Data":"766214c0e0a3d2c44051458267145692b94ccfa19f1796075fe1fd732a8f4575"} Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.092644 5200 generic.go:358] "Generic (PLEG): container finished" podID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerID="bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2" exitCode=0 Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.092770 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42c0a167-4edb-4321-b3c6-3790685f03e2","Type":"ContainerDied","Data":"bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2"} Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.092853 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.092864 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42c0a167-4edb-4321-b3c6-3790685f03e2","Type":"ContainerDied","Data":"72a4d137f01fb460e34de9964af075f46a0d1d2c0a7426665e876216619911c0"} Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.092909 5200 scope.go:117] "RemoveContainer" containerID="bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.103901 5200 reconciler_common.go:299] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.103940 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.103955 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c0a167-4edb-4321-b3c6-3790685f03e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.103965 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c0a167-4edb-4321-b3c6-3790685f03e2-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.103974 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqv7n\" (UniqueName: \"kubernetes.io/projected/42c0a167-4edb-4321-b3c6-3790685f03e2-kube-api-access-xqv7n\") on node \"crc\" DevicePath \"\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.146823 5200 scope.go:117] "RemoveContainer" containerID="b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.155979 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.169417 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.182950 5200 scope.go:117] "RemoveContainer" containerID="bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2" Nov 26 13:54:20 crc kubenswrapper[5200]: E1126 13:54:20.184976 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2\": container with ID starting with bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2 not found: ID does not exist" containerID="bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.185020 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2"} err="failed to get container status \"bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2\": rpc error: code = NotFound desc = could not find container \"bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2\": container with ID starting with 
bbd5645a1fc2e64f53f4d98f92c0c94cf063f797373f7210c75508792d0541c2 not found: ID does not exist" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.185050 5200 scope.go:117] "RemoveContainer" containerID="b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a" Nov 26 13:54:20 crc kubenswrapper[5200]: E1126 13:54:20.185389 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a\": container with ID starting with b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a not found: ID does not exist" containerID="b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.185490 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a"} err="failed to get container status \"b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a\": rpc error: code = NotFound desc = could not find container \"b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a\": container with ID starting with b02cb835ac82dbea53a6a0fb42b7c3cf9a39d75a65d43d4516e3cd6a0bd06d2a not found: ID does not exist" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.194962 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.196241 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-metadata" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.196265 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-metadata" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.196286 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-log" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.196294 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-log" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.196520 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-log" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.196552 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" containerName="nova-metadata-metadata" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.203157 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.206966 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-metadata-internal-svc\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.210085 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.211933 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.307793 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txq4\" (UniqueName: \"kubernetes.io/projected/8484d8df-fa24-4e8e-85ba-c318fc6cde61-kube-api-access-4txq4\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.308023 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.308107 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.308203 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-config-data\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.308847 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8484d8df-fa24-4e8e-85ba-c318fc6cde61-logs\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.411183 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4txq4\" (UniqueName: \"kubernetes.io/projected/8484d8df-fa24-4e8e-85ba-c318fc6cde61-kube-api-access-4txq4\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.411354 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.412713 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.412787 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-config-data\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.413060 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8484d8df-fa24-4e8e-85ba-c318fc6cde61-logs\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.413772 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8484d8df-fa24-4e8e-85ba-c318fc6cde61-logs\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.417860 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.417874 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.421101 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-config-data\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.433634 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txq4\" (UniqueName: \"kubernetes.io/projected/8484d8df-fa24-4e8e-85ba-c318fc6cde61-kube-api-access-4txq4\") pod \"nova-metadata-0\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.559963 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.984842 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c0a167-4edb-4321-b3c6-3790685f03e2" path="/var/lib/kubelet/pods/42c0a167-4edb-4321-b3c6-3790685f03e2/volumes" Nov 26 13:54:20 crc kubenswrapper[5200]: I1126 13:54:20.987584 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4579235-0d34-4e19-bea7-7c0aac805971" path="/var/lib/kubelet/pods/f4579235-0d34-4e19-bea7-7c0aac805971/volumes" Nov 26 13:54:21 crc kubenswrapper[5200]: I1126 13:54:21.124166 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:54:21 crc kubenswrapper[5200]: I1126 13:54:21.143919 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adab4e53-c668-44cf-be5d-5802a77124aa","Type":"ContainerStarted","Data":"76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba"} Nov 26 13:54:21 crc kubenswrapper[5200]: I1126 13:54:21.175410 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.175390952 podStartE2EDuration="2.175390952s" podCreationTimestamp="2025-11-26 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:54:21.171137171 +0000 UTC m=+1429.850338999" watchObservedRunningTime="2025-11-26 13:54:21.175390952 +0000 UTC m=+1429.854592770" Nov 26 13:54:22 crc kubenswrapper[5200]: I1126 13:54:22.174103 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8484d8df-fa24-4e8e-85ba-c318fc6cde61","Type":"ContainerStarted","Data":"994d0d2f29569bced7df7f345325a897433ec18de7893657a1375fbcb53bacf2"} Nov 26 13:54:22 crc kubenswrapper[5200]: I1126 13:54:22.177647 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8484d8df-fa24-4e8e-85ba-c318fc6cde61","Type":"ContainerStarted","Data":"6403883701f7e5dc62ff1b903104a5ab99aba9e00d1492955efcacde021d3e12"} Nov 26 13:54:22 crc kubenswrapper[5200]: I1126 13:54:22.177874 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8484d8df-fa24-4e8e-85ba-c318fc6cde61","Type":"ContainerStarted","Data":"138a46eb895f388cbd9ac7e54d166edebac46d5e12c56217fc148160c245f104"} Nov 26 13:54:22 crc kubenswrapper[5200]: I1126 13:54:22.202151 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.202076667 podStartE2EDuration="2.202076667s" podCreationTimestamp="2025-11-26 13:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:54:22.192325414 +0000 UTC m=+1430.871527242" watchObservedRunningTime="2025-11-26 13:54:22.202076667 +0000 UTC m=+1430.881278485" Nov 26 13:54:24 crc kubenswrapper[5200]: I1126 13:54:24.549606 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 13:54:25 crc kubenswrapper[5200]: I1126 13:54:25.560985 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 13:54:25 crc kubenswrapper[5200]: I1126 13:54:25.561039 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" 
Nov 26 13:54:27 crc kubenswrapper[5200]: I1126 13:54:27.460997 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 13:54:27 crc kubenswrapper[5200]: I1126 13:54:27.461749 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 13:54:28 crc kubenswrapper[5200]: I1126 13:54:28.467862 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 13:54:28 crc kubenswrapper[5200]: I1126 13:54:28.467928 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:54:29 crc kubenswrapper[5200]: I1126 13:54:29.545881 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 13:54:29 crc kubenswrapper[5200]: I1126 13:54:29.583587 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 13:54:30 crc kubenswrapper[5200]: I1126 13:54:30.329307 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 13:54:30 crc kubenswrapper[5200]: I1126 13:54:30.560444 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 13:54:30 crc kubenswrapper[5200]: I1126 13:54:30.560514 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 13:54:31 crc kubenswrapper[5200]: I1126 13:54:31.569915 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:54:31 crc kubenswrapper[5200]: I1126 13:54:31.569904 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.154381 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.154539 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.154616 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.155894 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c9e2fd96d697c529382ab47187145fbd161a7e2364741d5b8b26afa12ecb0a5"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.156022 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://8c9e2fd96d697c529382ab47187145fbd161a7e2364741d5b8b26afa12ecb0a5" gracePeriod=600 Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.321736 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="8c9e2fd96d697c529382ab47187145fbd161a7e2364741d5b8b26afa12ecb0a5" exitCode=0 Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.321841 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"8c9e2fd96d697c529382ab47187145fbd161a7e2364741d5b8b26afa12ecb0a5"} Nov 26 13:54:33 crc kubenswrapper[5200]: I1126 13:54:33.321902 5200 scope.go:117] "RemoveContainer" containerID="9374219ae715a3e96430de37c40ed96a5c875650a0a5602a0ef3886cef886129" Nov 26 13:54:34 crc kubenswrapper[5200]: I1126 13:54:34.341246 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970"} Nov 26 13:54:37 crc kubenswrapper[5200]: I1126 13:54:37.467944 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 13:54:37 crc kubenswrapper[5200]: I1126 13:54:37.469108 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 13:54:37 crc kubenswrapper[5200]: I1126 13:54:37.471316 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 13:54:37 crc kubenswrapper[5200]: I1126 13:54:37.479477 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 13:54:38 crc kubenswrapper[5200]: I1126 13:54:38.392600 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 13:54:38 crc kubenswrapper[5200]: I1126 13:54:38.402998 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 13:54:40 crc kubenswrapper[5200]: I1126 13:54:40.569428 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 13:54:40 crc kubenswrapper[5200]: I1126 13:54:40.570246 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 13:54:40 crc kubenswrapper[5200]: I1126 13:54:40.578807 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 13:54:40 crc kubenswrapper[5200]: I1126 
13:54:40.578957 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 13:54:40 crc kubenswrapper[5200]: I1126 13:54:40.935993 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 13:54:42 crc kubenswrapper[5200]: E1126 13:54:42.663658 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1786156 actualBytes=10240 Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.015755 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.016594 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" containerName="openstackclient" containerID="cri-o://819c90649cfd578627699cf1864ff703ef1f85ca28a43cafd1731fbe6e832be9" gracePeriod=2 Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.027905 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.046505 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.400321 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.435933 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.436969 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="openstack-network-exporter" containerID="cri-o://76f45a7a6ec5de6fe10745e16931f137f558e7423c3c48a60e8dae410a4c952b" gracePeriod=300 Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.469809 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.475886 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g4hpw"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.502975 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g4hpw"] Nov 26 13:55:02 crc kubenswrapper[5200]: E1126 13:55:02.525341 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 26 13:55:02 crc kubenswrapper[5200]: E1126 13:55:02.525406 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data podName:5f5ea8d4-ba2b-4abe-b645-0513268421f9 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:03.025389001 +0000 UTC m=+1471.704590819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data") pod "rabbitmq-server-0" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9") : configmap "rabbitmq-config-data" not found Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.531546 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-l8nht"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.563908 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-l8nht"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.602332 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance4584-account-delete-7mnx9"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.603732 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" containerName="openstackclient" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.603758 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" containerName="openstackclient" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.603982 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" containerName="openstackclient" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.620218 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.624209 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.629865 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance4584-account-delete-7mnx9"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.646213 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinderf459-account-delete-h8qgw"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.658066 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.691517 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinderf459-account-delete-h8qgw"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.740649 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="ovsdbserver-sb" containerID="cri-o://f5e60a029fe6708251acff52c633b481ec59ead862a791d6d3ca76ff77b39620" gracePeriod=300 Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.748335 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.785145 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-operator-scripts\") pod \"glance4584-account-delete-7mnx9\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.785366 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab255459-89c5-462e-a89d-eb26ba377f79-operator-scripts\") pod \"cinderf459-account-delete-h8qgw\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.785505 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb7xk\" (UniqueName: \"kubernetes.io/projected/ab255459-89c5-462e-a89d-eb26ba377f79-kube-api-access-hb7xk\") pod \"cinderf459-account-delete-h8qgw\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.785566 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqbh\" (UniqueName: \"kubernetes.io/projected/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-kube-api-access-xxqbh\") pod \"glance4584-account-delete-7mnx9\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.880363 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.880782 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="ovn-northd" containerID="cri-o://70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" gracePeriod=30 Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.881442 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="openstack-network-exporter" containerID="cri-o://5a4f406803509ed4c9f81a25d1d85f51801ee9e253c61023a592d2368060e342" gracePeriod=30 Nov 26 13:55:02 
crc kubenswrapper[5200]: I1126 13:55:02.897093 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7xk\" (UniqueName: \"kubernetes.io/projected/ab255459-89c5-462e-a89d-eb26ba377f79-kube-api-access-hb7xk\") pod \"cinderf459-account-delete-h8qgw\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.897212 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqbh\" (UniqueName: \"kubernetes.io/projected/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-kube-api-access-xxqbh\") pod \"glance4584-account-delete-7mnx9\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.897359 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-operator-scripts\") pod \"glance4584-account-delete-7mnx9\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.897633 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab255459-89c5-462e-a89d-eb26ba377f79-operator-scripts\") pod \"cinderf459-account-delete-h8qgw\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.907732 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab255459-89c5-462e-a89d-eb26ba377f79-operator-scripts\") pod \"cinderf459-account-delete-h8qgw\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.921926 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placemente6ea-account-delete-s2v6r"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.923050 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-operator-scripts\") pod \"glance4584-account-delete-7mnx9\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.959699 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.963936 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.965727 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.969953 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:02 crc kubenswrapper[5200]: I1126 13:55:02.989146 5200 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/glance-default-external-api-0" secret="" err="secret \"glance-glance-dockercfg-q4wxh\" not found" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.046397 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb7xk\" (UniqueName: \"kubernetes.io/projected/ab255459-89c5-462e-a89d-eb26ba377f79-kube-api-access-hb7xk\") pod \"cinderf459-account-delete-h8qgw\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.080818 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a5e8b8-15b7-4f2b-a60f-3c28341cbe01" path="/var/lib/kubelet/pods/71a5e8b8-15b7-4f2b-a60f-3c28341cbe01/volumes" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.081712 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3dd593-fa3c-491a-9f97-84f0dee04150" path="/var/lib/kubelet/pods/dd3dd593-fa3c-491a-9f97-84f0dee04150/volumes" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.082365 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placemente6ea-account-delete-s2v6r"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.082390 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-qp469"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.082402 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-qp469"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.082418 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8vf92"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.082632 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-8vf92" podUID="82b72f68-40bd-4a81-914a-dda4355b9fc4" containerName="openstack-network-exporter" containerID="cri-o://bf61bc7bfe41e1213dbebfb173eaf919afb90cb614da617b3eb37e70e0d8b683" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.096386 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqbh\" (UniqueName: \"kubernetes.io/projected/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-kube-api-access-xxqbh\") pod \"glance4584-account-delete-7mnx9\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.109742 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-qk25n"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.112335 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsz2q\" (UniqueName: \"kubernetes.io/projected/05534089-21c2-4d8c-9b9e-d21c7082e727-kube-api-access-bsz2q\") pod 
\"placemente6ea-account-delete-s2v6r\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.113959 5200 secret.go:189] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.114061 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:03.614037288 +0000 UTC m=+1472.293239106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-default-external-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.114110 5200 secret.go:189] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.114130 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:03.61412335 +0000 UTC m=+1472.293325168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-scripts" not found Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.128078 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kp5s8"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.131765 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05534089-21c2-4d8c-9b9e-d21c7082e727-operator-scripts\") pod \"placemente6ea-account-delete-s2v6r\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.132109 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.133354 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.135140 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.135207 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data podName:5f5ea8d4-ba2b-4abe-b645-0513268421f9 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:04.135185077 +0000 UTC m=+1472.814386895 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data") pod "rabbitmq-server-0" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9") : configmap "rabbitmq-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.165335 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wn5kl"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.207098 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wn5kl"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.233704 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05534089-21c2-4d8c-9b9e-d21c7082e727-operator-scripts\") pod \"placemente6ea-account-delete-s2v6r\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.233864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsz2q\" (UniqueName: \"kubernetes.io/projected/05534089-21c2-4d8c-9b9e-d21c7082e727-kube-api-access-bsz2q\") pod \"placemente6ea-account-delete-s2v6r\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.234175 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.234236 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data podName:4c5390ed-d89e-4450-adf3-1363b1150d1b nodeName:}" failed. No retries permitted until 2025-11-26 13:55:03.734219307 +0000 UTC m=+1472.413421125 (durationBeforeRetry 500ms). 
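[editor's note] Across these entries the retry delay for the same volume doubles: "durationBeforeRetry 500ms", then 1s, and 2s later in this capture. The sketch below only illustrates that doubling schedule; the starting delay is taken from the log, but the cap is an assumption for the sketch and this is not kubelet's actual nestedpendingoperations code.

```go
// backoff.go: sketch of the doubling retry delay visible in the
// "No retries permitted until ..." entries above (500ms -> 1s -> 2s -> ...).
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // first delay observed in the log
	const maxDelay = 2 * time.Minute // assumed cap for illustration

	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```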
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b") : configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.234995 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05534089-21c2-4d8c-9b9e-d21c7082e727-operator-scripts\") pod \"placemente6ea-account-delete-s2v6r\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.236770 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vk528"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.250350 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vk528"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.277481 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f7d474f5-w7lqk"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.278182 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" podUID="4972e45b-4085-4bbb-a052-95698f889c49" containerName="dnsmasq-dns" containerID="cri-o://e57e9a52f8cde4ed409c4daef24f8b64435e90ed6915c4ccca0b63cbaa9821a8" gracePeriod=10 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.300274 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutronca07-account-delete-xb4zf"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.331593 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsz2q\" (UniqueName: \"kubernetes.io/projected/05534089-21c2-4d8c-9b9e-d21c7082e727-kube-api-access-bsz2q\") pod \"placemente6ea-account-delete-s2v6r\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.336447 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutronca07-account-delete-xb4zf"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.336587 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.350782 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican0a47-account-delete-92fz8"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.371065 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.397287 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican0a47-account-delete-92fz8"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.397427 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.444808 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caee6c82-a418-4a1d-a364-0a0eb9073c09-operator-scripts\") pod \"neutronca07-account-delete-xb4zf\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.444953 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4x8\" (UniqueName: \"kubernetes.io/projected/caee6c82-a418-4a1d-a364-0a0eb9073c09-kube-api-access-hd4x8\") pod \"neutronca07-account-delete-xb4zf\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.501851 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.502187 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="cinder-scheduler" containerID="cri-o://5bb653ca071dc9a61e5ecf99b1fe3a17c36b5dd2807279eee649a3530583f056" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.502712 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="probe" containerID="cri-o://cda7517c7cfab2ef565f5c3fa5201af6cdade851c6625c180d2015174e06c39f" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.544149 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pzmsj"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.555514 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-operator-scripts\") pod \"barbican0a47-account-delete-92fz8\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.555944 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caee6c82-a418-4a1d-a364-0a0eb9073c09-operator-scripts\") pod \"neutronca07-account-delete-xb4zf\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.556094 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj2g5\" (UniqueName: \"kubernetes.io/projected/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-kube-api-access-lj2g5\") pod \"barbican0a47-account-delete-92fz8\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.556118 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4x8\" (UniqueName: \"kubernetes.io/projected/caee6c82-a418-4a1d-a364-0a0eb9073c09-kube-api-access-hd4x8\") pod \"neutronca07-account-delete-xb4zf\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " 
pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.571369 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caee6c82-a418-4a1d-a364-0a0eb9073c09-operator-scripts\") pod \"neutronca07-account-delete-xb4zf\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.579716 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.593534 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4x8\" (UniqueName: \"kubernetes.io/projected/caee6c82-a418-4a1d-a364-0a0eb9073c09-kube-api-access-hd4x8\") pod \"neutronca07-account-delete-xb4zf\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.628253 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.642022 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pzmsj"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.659313 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj2g5\" (UniqueName: \"kubernetes.io/projected/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-kube-api-access-lj2g5\") pod \"barbican0a47-account-delete-92fz8\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.659436 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-operator-scripts\") pod \"barbican0a47-account-delete-92fz8\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.660426 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-operator-scripts\") pod \"barbican0a47-account-delete-92fz8\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.660525 5200 secret.go:189] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.660573 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:04.660557942 +0000 UTC m=+1473.339759760 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-scripts" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.660925 5200 secret.go:189] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.660951 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:04.660943862 +0000 UTC m=+1473.340145680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-default-external-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.713133 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj2g5\" (UniqueName: \"kubernetes.io/projected/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-kube-api-access-lj2g5\") pod \"barbican0a47-account-delete-92fz8\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.728819 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/novacell0e30c-account-delete-pnqrp"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.755986 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.772498 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: E1126 13:55:03.772575 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data podName:4c5390ed-d89e-4450-adf3-1363b1150d1b nodeName:}" failed. No retries permitted until 2025-11-26 13:55:04.772556038 +0000 UTC m=+1473.451757856 (durationBeforeRetry 1s). 
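[editor's note] With several pods now blocked on missing ConfigMaps and Secrets, it is easier to summarize the MountVolume.SetUp failures than to read them one by one. The sketch below scans a saved copy of this journal (for example `journalctl -u kubelet > kubelet.log`) and counts failures per pod, volume and reason; the regular expression is derived from the message format shown above and may need adjusting for other kubelet versions.

```go
// mount-failures.go: summarize MountVolume.SetUp failures in a saved kubelet journal.
// Usage: go run mount-failures.go kubelet.log
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches: MountVolume.SetUp failed for volume "X" (...) pod "Y" (...) : <reason>
var failRE = regexp.MustCompile(
	`MountVolume\.SetUp failed for volume "([^"]+)" .* pod "([^"]+)" .* : (.+)$`)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: mount-failures <kubelet.log>")
		os.Exit(1)
	}
	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := failRE.FindStringSubmatch(sc.Text()); m != nil {
			key := fmt.Sprintf("pod=%s volume=%s reason=%s", m[2], m[1], m[3])
			counts[key]++
		}
	}
	for key, n := range counts {
		fmt.Printf("%4d  %s\n", n, key)
	}
}
```

On this capture it would group the repeated rabbitmq-server-0, rabbitmq-cell1-server-0 and glance-default-external-api-0 failures into a handful of lines, one per missing object.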
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b") : configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.831491 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0e30c-account-delete-pnqrp"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.878352 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.878712 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api-log" containerID="cri-o://7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.879475 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api" containerID="cri-o://6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.885431 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cb616b41-7836-4e58-a88b-bb68df018d24/ovsdbserver-sb/0.log" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.885489 5200 generic.go:358] "Generic (PLEG): container finished" podID="cb616b41-7836-4e58-a88b-bb68df018d24" containerID="76f45a7a6ec5de6fe10745e16931f137f558e7423c3c48a60e8dae410a4c952b" exitCode=2 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.885510 5200 generic.go:358] "Generic (PLEG): container finished" podID="cb616b41-7836-4e58-a88b-bb68df018d24" containerID="f5e60a029fe6708251acff52c633b481ec59ead862a791d6d3ca76ff77b39620" exitCode=143 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.885635 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb616b41-7836-4e58-a88b-bb68df018d24","Type":"ContainerDied","Data":"76f45a7a6ec5de6fe10745e16931f137f558e7423c3c48a60e8dae410a4c952b"} Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.885664 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb616b41-7836-4e58-a88b-bb68df018d24","Type":"ContainerDied","Data":"f5e60a029fe6708251acff52c633b481ec59ead862a791d6d3ca76ff77b39620"} Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.886728 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7wh\" (UniqueName: \"kubernetes.io/projected/6d529441-c1c2-4f95-bc63-ce522b9f1b07-kube-api-access-vr7wh\") pod \"novacell0e30c-account-delete-pnqrp\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") " pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.893729 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts\") pod \"novacell0e30c-account-delete-pnqrp\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") " pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.915429 5200 generic.go:358] 
"Generic (PLEG): container finished" podID="12479106-7c82-418c-b017-03e5fcc4f018" containerID="5a4f406803509ed4c9f81a25d1d85f51801ee9e253c61023a592d2368060e342" exitCode=2 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.915875 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"12479106-7c82-418c-b017-03e5fcc4f018","Type":"ContainerDied","Data":"5a4f406803509ed4c9f81a25d1d85f51801ee9e253c61023a592d2368060e342"} Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.955938 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-779774b9bd-gltc4"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.956544 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/placement-779774b9bd-gltc4" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-log" containerID="cri-o://01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.957365 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/placement-779774b9bd-gltc4" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-api" containerID="cri-o://4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce" gracePeriod=30 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.962514 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8vf92_82b72f68-40bd-4a81-914a-dda4355b9fc4/openstack-network-exporter/0.log" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.962635 5200 generic.go:358] "Generic (PLEG): container finished" podID="82b72f68-40bd-4a81-914a-dda4355b9fc4" containerID="bf61bc7bfe41e1213dbebfb173eaf919afb90cb614da617b3eb37e70e0d8b683" exitCode=2 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.962965 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vf92" event={"ID":"82b72f68-40bd-4a81-914a-dda4355b9fc4","Type":"ContainerDied","Data":"bf61bc7bfe41e1213dbebfb173eaf919afb90cb614da617b3eb37e70e0d8b683"} Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.977971 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.978904 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="openstack-network-exporter" containerID="cri-o://ad8e5c8ff258934df8b031e4ca400c00da84dc8e84950692fb247eb689629ffd" gracePeriod=300 Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.989881 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:03 crc kubenswrapper[5200]: I1126 13:55:03.998959 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/novaapi5c5d-account-delete-n94tm"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.005037 5200 generic.go:358] "Generic (PLEG): container finished" podID="4972e45b-4085-4bbb-a052-95698f889c49" containerID="e57e9a52f8cde4ed409c4daef24f8b64435e90ed6915c4ccca0b63cbaa9821a8" exitCode=0 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.006455 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts\") pod \"novacell0e30c-account-delete-pnqrp\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") " pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.007304 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts\") pod \"novacell0e30c-account-delete-pnqrp\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") " pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.008253 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr7wh\" (UniqueName: \"kubernetes.io/projected/6d529441-c1c2-4f95-bc63-ce522b9f1b07-kube-api-access-vr7wh\") pod \"novacell0e30c-account-delete-pnqrp\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") " pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.045434 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi5c5d-account-delete-n94tm"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.045479 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" event={"ID":"4972e45b-4085-4bbb-a052-95698f889c49","Type":"ContainerDied","Data":"e57e9a52f8cde4ed409c4daef24f8b64435e90ed6915c4ccca0b63cbaa9821a8"} Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.045508 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.045913 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-log" containerID="cri-o://7503a8964c681bf49d84540f8809922bb7d2d71d2f4e4f1b1afea676f29d0da7" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.046391 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-httpd" containerID="cri-o://03b29c837a967963b6848307dc062dac95cb5e7b79937c757a26948129ebd5ec" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.057975 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.058655 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-747455c75f-8zrqn"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.058882 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-747455c75f-8zrqn" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-api" containerID="cri-o://d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.058982 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-747455c75f-8zrqn" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-httpd" containerID="cri-o://50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.144871 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7wh\" (UniqueName: \"kubernetes.io/projected/6d529441-c1c2-4f95-bc63-ce522b9f1b07-kube-api-access-vr7wh\") pod \"novacell0e30c-account-delete-pnqrp\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") " pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.183107 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e30c-account-delete-pnqrp" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.196362 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts\") pod \"novaapi5c5d-account-delete-n94tm\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") " pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.196472 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62496\" (UniqueName: \"kubernetes.io/projected/225d875d-9c0b-4acd-9305-b5cd82ea9056-kube-api-access-62496\") pod \"novaapi5c5d-account-delete-n94tm\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") " pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.200148 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.200225 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data podName:5f5ea8d4-ba2b-4abe-b645-0513268421f9 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:06.200202967 +0000 UTC m=+1474.879404785 (durationBeforeRetry 2s). 
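[editor's note] The shutdown entries above kill containers with very different grace periods: 2s for openstackclient, 10s for dnsmasq-dns, 30s for most API services, 300s for the OVN database servers, and 600s for machine-config-daemon at the start of this section. A sketch that extracts those values from a saved journal, under the same file-name and format assumptions as the previous sketch:

```go
// grace-periods.go: list pod, container and gracePeriod from
// "Killing container with a grace period" entries in a saved kubelet journal.
// Usage: go run grace-periods.go kubelet.log
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var killRE = regexp.MustCompile(
	`Killing container with a grace period" pod="([^"]+)" podUID="[^"]+" containerName="([^"]+)" containerID="[^"]+" gracePeriod=(\d+)`)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: grace-periods <kubelet.log>")
		os.Exit(1)
	}
	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := killRE.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%-55s %-35s %ss\n", m[1], m[2], m[3])
		}
	}
}
```

Sorting that output by the last column makes outliers (unusually long terminationGracePeriodSeconds) stand out immediately.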
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data") pod "rabbitmq-server-0" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9") : configmap "rabbitmq-config-data" not found Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.223494 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.224339 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-log" containerID="cri-o://f77ef15f63cb09956b5bfe2dd8b835072429e31cb25ce434a16fdc1b92dc9dea" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.224809 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-httpd" containerID="cri-o://f8133fd16248ab3367c5964464b825f0d076ca73089fbe5d382745f4baeb4dd1" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.248930 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="ovsdbserver-nb" containerID="cri-o://3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af" gracePeriod=300 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.254086 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.261709 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-server" containerID="cri-o://903ac30a9dbec48f735e990bea93a6806f1762703320ed6711cbac6fbff79c06" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262405 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="swift-recon-cron" containerID="cri-o://d6a3d13e69838eb9d4252d0286848176a74b5a51be636cf9579e83eb743b5064" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262480 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="rsync" containerID="cri-o://8a3c5bcf0c2ae9a26e17f253d1e64688670620c4dc1055a422aca31fb19825c3" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262540 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-expirer" containerID="cri-o://87a7e0f4bc1babfa0daa5d1e1ca5980686a23a89b61b6d2d33f6849a4eb58074" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262587 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-updater" containerID="cri-o://e400c5c5cf3eebf7e008c823d4b2464e8e01eeefe341a90c08ccbb54d5c9bc15" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262647 5200 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-server" containerID="cri-o://8b51fec0244bcdc2b862bc7cdd69319c60e9093b1b1eee3641c5eec3c4239828" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262665 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-replicator" containerID="cri-o://2ab0a7853e519d877df1c22e5ebeb99b1c64c7a449e720446cdb4ecc46d44e12" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262722 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-reaper" containerID="cri-o://42f52ac7a2a3d8ce2846507f56c360127f1cd001844f1aa503e54d119743673b" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262768 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-auditor" containerID="cri-o://2eb02dc54fa136e7c5025e9634090a7041d9069cad341af2a5616acb5c290480" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262808 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-replicator" containerID="cri-o://87378e41d86dff6f16d62f1a53be1da15948d42fc9a54646c7077b0037089474" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262843 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-auditor" containerID="cri-o://3bfc3a40523ed141ac9ba98103ae3054008e91cd9185bc26c4c466d7974e3c46" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262918 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-replicator" containerID="cri-o://5c2fe7129b5e32d44a73bbe4f844f7bea44dd368fbaba824bda4a249bc55166e" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.262964 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-server" containerID="cri-o://f63f390a689d7edab2c7f57aaab9b53c11b9ab447c109328c91f7c6d40c5f5ec" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.263013 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-updater" containerID="cri-o://f5ee039fbe62517040d20e45c50d6d689b188548be0f48205093fbd0376ceb73" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.263091 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-auditor" containerID="cri-o://73f91d32af16b5469f8f5d9eeb6ad5b3a0299f66553fb69a26e37fd1e4724334" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.303932 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pq5lt"] Nov 26 13:55:04 
crc kubenswrapper[5200]: I1126 13:55:04.303978 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts\") pod \"novaapi5c5d-account-delete-n94tm\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") " pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.304340 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62496\" (UniqueName: \"kubernetes.io/projected/225d875d-9c0b-4acd-9305-b5cd82ea9056-kube-api-access-62496\") pod \"novaapi5c5d-account-delete-n94tm\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") " pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.305214 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts\") pod \"novaapi5c5d-account-delete-n94tm\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") " pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.315272 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pq5lt"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.333896 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-58ghr"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.349925 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62496\" (UniqueName: \"kubernetes.io/projected/225d875d-9c0b-4acd-9305-b5cd82ea9056-kube-api-access-62496\") pod \"novaapi5c5d-account-delete-n94tm\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") " pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.356011 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-58ghr"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.368190 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.387839 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-75cc5c4f79-dfmjl"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.388210 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-httpd" containerID="cri-o://3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.388305 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-server" containerID="cri-o://878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.418050 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7544499f5f-8pt52"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.418423 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-worker-7544499f5f-8pt52" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" 
containerName="barbican-worker-log" containerID="cri-o://fea23ba732bf069b37174feb4444f8fe8af7b5feb6c583a3ca3c7cf599079d2c" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.419105 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-worker-7544499f5f-8pt52" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker" containerID="cri-o://da8e534c170ae59a27fe6be7bcd6d2dc6851930207b80fde5ade7f1d41b386bc" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.480826 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.486712 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi5c5d-account-delete-n94tm" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.498142 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.498430 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-log" containerID="cri-o://913da7fa54a8dadf2cb1d0f96e7c1a4c10b128d92a72d6916595429a588b3727" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.498916 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-api" containerID="cri-o://78be597c70fe60222ad7f16fc5b17c8bb5fb9f375d97a8edf8fce01c8f2a8af4" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.520838 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.521102 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-log" containerID="cri-o://6403883701f7e5dc62ff1b903104a5ab99aba9e00d1492955efcacde021d3e12" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.522015 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-metadata" containerID="cri-o://994d0d2f29569bced7df7f345325a897433ec18de7893657a1375fbcb53bacf2" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.529791 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerName="rabbitmq" containerID="cri-o://78c4512cda26fc1c5231bc7d98d45ac8739cbb21786dfbb0e443ec8c003043b6" gracePeriod=604800 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.543980 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.547202 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.166:8080/healthcheck\": read tcp 10.217.0.2:49636->10.217.0.166:8080: read: connection reset by peer" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.602402 5200 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-db-create-zkjwj"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.616789 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zkjwj"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.619612 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerName="rabbitmq" containerID="cri-o://6b70899d8462595d68160840ba2f772a58ad47466f6d54dfc39939ee3313bcb9" gracePeriod=604800 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.666067 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ed9a-account-create-update-6qjv9"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.686560 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d9fcdcd8b-tl47q"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.686934 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api-log" containerID="cri-o://a4649f26c430accc7c822a3b6948eaa51389ccaecd8fe7409b8405cd4514eba9" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.687089 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api" containerID="cri-o://c58ee236e1d2b49c25a04da65637b68cbc07d2541bfa6191ac3c1b680febe5ff" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.717804 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-55d547fcd6-blkmf"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.718103 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener-log" containerID="cri-o://f9fa9adbbf6d1354d89225638d8e5a8383e2d9b6d0d928de2d7dc4db987750c0" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.718559 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener" containerID="cri-o://6c6a86eadcfcca89403d03d1232bb2037cc1a1a3bd1de3d050a23ad833ce0858" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.730044 5200 secret.go:189] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.730103 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:06.730085239 +0000 UTC m=+1475.409287057 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-default-external-config-data" not found Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.730142 5200 secret.go:189] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.730162 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:06.73015585 +0000 UTC m=+1475.409357668 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-scripts" not found Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.744860 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ed9a-account-create-update-6qjv9"] Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.767738 5200 handlers.go:80] "Exec lifecycle hook for Container in Pod failed" err=< Nov 26 13:55:04 crc kubenswrapper[5200]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 26 13:55:04 crc kubenswrapper[5200]: + source /usr/local/bin/container-scripts/functions Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNBridge=br-int Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNRemote=tcp:localhost:6642 Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNEncapType=geneve Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNAvailabilityZones= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ EnableChassisAsGateway=true Nov 26 13:55:04 crc kubenswrapper[5200]: ++ PhysicalNetworks= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNHostName= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 26 13:55:04 crc kubenswrapper[5200]: ++ ovs_dir=/var/lib/openvswitch Nov 26 13:55:04 crc kubenswrapper[5200]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 26 13:55:04 crc kubenswrapper[5200]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 26 13:55:04 crc kubenswrapper[5200]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + sleep 0.5 Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + sleep 0.5 Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + cleanup_ovsdb_server_semaphore Nov 26 13:55:04 crc kubenswrapper[5200]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 26 13:55:04 crc kubenswrapper[5200]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 26 13:55:04 crc kubenswrapper[5200]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-qk25n" message=< Nov 26 13:55:04 crc kubenswrapper[5200]: Exiting ovsdb-server (5) [ OK ] Nov 26 13:55:04 crc kubenswrapper[5200]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 26 13:55:04 crc kubenswrapper[5200]: + source /usr/local/bin/container-scripts/functions Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNBridge=br-int Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNRemote=tcp:localhost:6642 Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNEncapType=geneve Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNAvailabilityZones= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ EnableChassisAsGateway=true Nov 26 13:55:04 crc kubenswrapper[5200]: ++ PhysicalNetworks= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNHostName= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 26 13:55:04 crc kubenswrapper[5200]: ++ ovs_dir=/var/lib/openvswitch Nov 26 13:55:04 crc kubenswrapper[5200]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 26 13:55:04 crc kubenswrapper[5200]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 26 13:55:04 crc kubenswrapper[5200]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + sleep 0.5 Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + sleep 0.5 Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + cleanup_ovsdb_server_semaphore Nov 26 13:55:04 crc kubenswrapper[5200]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 26 13:55:04 crc kubenswrapper[5200]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 26 13:55:04 crc kubenswrapper[5200]: > Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.767800 5200 kuberuntime_container.go:741] "PreStop hook failed" err=< Nov 26 13:55:04 crc kubenswrapper[5200]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 26 13:55:04 crc kubenswrapper[5200]: + source /usr/local/bin/container-scripts/functions Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNBridge=br-int Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNRemote=tcp:localhost:6642 Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNEncapType=geneve Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNAvailabilityZones= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ EnableChassisAsGateway=true Nov 26 13:55:04 crc kubenswrapper[5200]: ++ PhysicalNetworks= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ OVNHostName= Nov 26 13:55:04 crc kubenswrapper[5200]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 26 13:55:04 crc kubenswrapper[5200]: ++ ovs_dir=/var/lib/openvswitch Nov 26 13:55:04 crc kubenswrapper[5200]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 26 13:55:04 crc kubenswrapper[5200]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 26 13:55:04 crc kubenswrapper[5200]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + sleep 0.5 Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + sleep 0.5 Nov 26 13:55:04 crc kubenswrapper[5200]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 26 13:55:04 crc kubenswrapper[5200]: + cleanup_ovsdb_server_semaphore Nov 26 13:55:04 crc kubenswrapper[5200]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 26 13:55:04 crc kubenswrapper[5200]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 26 13:55:04 crc kubenswrapper[5200]: > pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" containerID="cri-o://f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.767857 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" containerID="cri-o://f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" gracePeriod=29 Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.811866 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.812209 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dae2b75b63a59901de07ca31786301f42b38647175de1bf352b0350a4ab170d9" gracePeriod=30 Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.832374 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:04 crc kubenswrapper[5200]: E1126 13:55:04.832453 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data podName:4c5390ed-d89e-4450-adf3-1363b1150d1b nodeName:}" failed. No retries permitted until 2025-11-26 13:55:06.832436725 +0000 UTC m=+1475.511638543 (durationBeforeRetry 2s). 
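The two error blocks above ("Exec lifecycle hook for Container in Pod failed" and "PreStop hook failed") are the xtrace output of the ovsdb-server container's PreStop hook. Read together, the trace implies roughly the script below: source a helper library, poll for a semaphore file that marks it safe to stop, remove it, then stop only ovsdb-server via ovs-ctl. This is a reconstruction from the trace, not the actual file shipped in the image; the functions library and cleanup_ovsdb_server_semaphore are known only from the assignments and calls visible above and may differ in the real /usr/local/bin/container-scripts/stop-ovsdb-server.sh.

    #!/bin/bash
    # Reconstructed from the xtrace in the log above; details may differ from the
    # real stop-ovsdb-server.sh.
    set -x

    # "functions" defines OVNBridge, DB_FILE, SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE,
    # cleanup_ovsdb_server_semaphore, etc. (assignments visible in the trace).
    source "$(dirname "$0")/functions"

    # Poll until some other component creates the semaphore file that marks it
    # safe to stop ovsdb-server (who creates it is not visible in this log).
    while [ ! -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE" ]; do
        sleep 0.5
    done

    # Per the trace, this just removes the semaphore file again.
    cleanup_ovsdb_server_semaphore

    # Stop ovsdb-server only; ovs-vswitchd runs in a separate container.
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd

The hook's exit code 137 (128 + SIGKILL) indicates the script was killed rather than finishing on its own, and the following "Killing container with a grace period" entries show gracePeriod=29 for ovsdb-server and 28 for ovs-vswitchd rather than 30, consistent with the kubelet subtracting the time already spent in the PreStop hook from the pod's termination grace period.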
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b") : configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.935625 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinderf459-account-delete-h8qgw"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.963382 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:55:04 crc kubenswrapper[5200]: I1126 13:55:04.964080 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="adab4e53-c668-44cf-be5d-5802a77124aa" containerName="nova-scheduler-scheduler" containerID="cri-o://76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" gracePeriod=30 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.043878 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbe3ea8-34d8-4d29-a552-f7408d9a66d5" path="/var/lib/kubelet/pods/0bbe3ea8-34d8-4d29-a552-f7408d9a66d5/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.051758 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d" path="/var/lib/kubelet/pods/1af3d0a6-007c-4012-9f5b-2efe1e0b5a8d/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.054288 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f93568e-5ff5-4a13-af0f-67dddc7615c1" path="/var/lib/kubelet/pods/3f93568e-5ff5-4a13-af0f-67dddc7615c1/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.054958 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce2de04-4f6b-49fa-9946-5ef9663e95e8" path="/var/lib/kubelet/pods/4ce2de04-4f6b-49fa-9946-5ef9663e95e8/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.055628 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5" path="/var/lib/kubelet/pods/8a96b2e0-9e84-40fd-b9cb-ef81e2e673f5/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.063600 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae" path="/var/lib/kubelet/pods/bd085e2b-49e8-4b11-ac6b-4aa8c968a2ae/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.064600 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1df6fce-e471-466b-b8fb-e21823bb3e3f" path="/var/lib/kubelet/pods/e1df6fce-e471-466b-b8fb-e21823bb3e3f/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.065336 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6" path="/var/lib/kubelet/pods/f4a8a0f1-3a2d-47ec-84a9-042b56f2d0d6/volumes" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.116399 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerName="galera" containerID="cri-o://8eb763e1a24e243fecca1c957e0644829929333388f6e4d32ab9411ef66f8cf1" gracePeriod=30 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.140920 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" 
containerName="ovs-vswitchd" containerID="cri-o://7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" gracePeriod=28 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.163727 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="8a3c5bcf0c2ae9a26e17f253d1e64688670620c4dc1055a422aca31fb19825c3" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.164341 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="87a7e0f4bc1babfa0daa5d1e1ca5980686a23a89b61b6d2d33f6849a4eb58074" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.164353 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="e400c5c5cf3eebf7e008c823d4b2464e8e01eeefe341a90c08ccbb54d5c9bc15" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.164362 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="3bfc3a40523ed141ac9ba98103ae3054008e91cd9185bc26c4c466d7974e3c46" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.164370 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="5c2fe7129b5e32d44a73bbe4f844f7bea44dd368fbaba824bda4a249bc55166e" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.164923 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="f63f390a689d7edab2c7f57aaab9b53c11b9ab447c109328c91f7c6d40c5f5ec" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165057 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="f5ee039fbe62517040d20e45c50d6d689b188548be0f48205093fbd0376ceb73" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165072 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="73f91d32af16b5469f8f5d9eeb6ad5b3a0299f66553fb69a26e37fd1e4724334" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165079 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="2ab0a7853e519d877df1c22e5ebeb99b1c64c7a449e720446cdb4ecc46d44e12" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165092 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="42f52ac7a2a3d8ce2846507f56c360127f1cd001844f1aa503e54d119743673b" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165098 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="2eb02dc54fa136e7c5025e9634090a7041d9069cad341af2a5616acb5c290480" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165389 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="87378e41d86dff6f16d62f1a53be1da15948d42fc9a54646c7077b0037089474" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165801 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"8a3c5bcf0c2ae9a26e17f253d1e64688670620c4dc1055a422aca31fb19825c3"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 
13:55:05.165958 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"87a7e0f4bc1babfa0daa5d1e1ca5980686a23a89b61b6d2d33f6849a4eb58074"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165983 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"e400c5c5cf3eebf7e008c823d4b2464e8e01eeefe341a90c08ccbb54d5c9bc15"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.165999 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"3bfc3a40523ed141ac9ba98103ae3054008e91cd9185bc26c4c466d7974e3c46"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166016 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"5c2fe7129b5e32d44a73bbe4f844f7bea44dd368fbaba824bda4a249bc55166e"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166032 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"f63f390a689d7edab2c7f57aaab9b53c11b9ab447c109328c91f7c6d40c5f5ec"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166045 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"f5ee039fbe62517040d20e45c50d6d689b188548be0f48205093fbd0376ceb73"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166060 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"73f91d32af16b5469f8f5d9eeb6ad5b3a0299f66553fb69a26e37fd1e4724334"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166072 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"2ab0a7853e519d877df1c22e5ebeb99b1c64c7a449e720446cdb4ecc46d44e12"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166086 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"42f52ac7a2a3d8ce2846507f56c360127f1cd001844f1aa503e54d119743673b"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166100 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"2eb02dc54fa136e7c5025e9634090a7041d9069cad341af2a5616acb5c290480"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.166112 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"87378e41d86dff6f16d62f1a53be1da15948d42fc9a54646c7077b0037089474"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.170370 5200 generic.go:358] "Generic (PLEG): container finished" podID="b8c23725-1545-4d07-8fc3-a4dd066a685e" containerID="819c90649cfd578627699cf1864ff703ef1f85ca28a43cafd1731fbe6e832be9" exitCode=137 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 
13:55:05.173546 5200 generic.go:358] "Generic (PLEG): container finished" podID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerID="6403883701f7e5dc62ff1b903104a5ab99aba9e00d1492955efcacde021d3e12" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.173636 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8484d8df-fa24-4e8e-85ba-c318fc6cde61","Type":"ContainerDied","Data":"6403883701f7e5dc62ff1b903104a5ab99aba9e00d1492955efcacde021d3e12"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.176417 5200 generic.go:358] "Generic (PLEG): container finished" podID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerID="fea23ba732bf069b37174feb4444f8fe8af7b5feb6c583a3ca3c7cf599079d2c" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.176481 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7544499f5f-8pt52" event={"ID":"0a85e23d-55dd-4d11-b304-aeb59804f47d","Type":"ContainerDied","Data":"fea23ba732bf069b37174feb4444f8fe8af7b5feb6c583a3ca3c7cf599079d2c"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.178667 5200 generic.go:358] "Generic (PLEG): container finished" podID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerID="50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.178722 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747455c75f-8zrqn" event={"ID":"c3f6e95d-1b68-4641-a4e5-6bd172c0662e","Type":"ContainerDied","Data":"50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.186364 5200 generic.go:358] "Generic (PLEG): container finished" podID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerID="f77ef15f63cb09956b5bfe2dd8b835072429e31cb25ce434a16fdc1b92dc9dea" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.186503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4bdf37de-cbcd-4971-a688-f3b91cf98b47","Type":"ContainerDied","Data":"f77ef15f63cb09956b5bfe2dd8b835072429e31cb25ce434a16fdc1b92dc9dea"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.194674 5200 generic.go:358] "Generic (PLEG): container finished" podID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerID="913da7fa54a8dadf2cb1d0f96e7c1a4c10b128d92a72d6916595429a588b3727" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.194703 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0","Type":"ContainerDied","Data":"913da7fa54a8dadf2cb1d0f96e7c1a4c10b128d92a72d6916595429a588b3727"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.199098 5200 generic.go:358] "Generic (PLEG): container finished" podID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerID="a4649f26c430accc7c822a3b6948eaa51389ccaecd8fe7409b8405cd4514eba9" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.199223 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" event={"ID":"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59","Type":"ContainerDied","Data":"a4649f26c430accc7c822a3b6948eaa51389ccaecd8fe7409b8405cd4514eba9"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.213382 5200 generic.go:358] "Generic (PLEG): container finished" podID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" 
containerID="7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.213480 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ff2e56e0-4f2e-4101-b051-111cb2cbee56","Type":"ContainerDied","Data":"7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.225419 5200 generic.go:358] "Generic (PLEG): container finished" podID="5dd57521-dfcc-4303-9f37-3485c7407851" containerID="f9fa9adbbf6d1354d89225638d8e5a8383e2d9b6d0d928de2d7dc4db987750c0" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.225559 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" event={"ID":"5dd57521-dfcc-4303-9f37-3485c7407851","Type":"ContainerDied","Data":"f9fa9adbbf6d1354d89225638d8e5a8383e2d9b6d0d928de2d7dc4db987750c0"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.231672 5200 generic.go:358] "Generic (PLEG): container finished" podID="257070be-da5f-4601-b466-b9f7c8c8f153" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.231888 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerDied","Data":"f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.237887 5200 generic.go:358] "Generic (PLEG): container finished" podID="b0772c27-b753-402f-aded-0d4050bc80bd" containerID="3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9" exitCode=0 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.238270 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" event={"ID":"b0772c27-b753-402f-aded-0d4050bc80bd","Type":"ContainerDied","Data":"3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.242505 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8238a7a4-ddcb-4cc8-ac43-72042e4c4f50/ovsdbserver-nb/0.log" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.242928 5200 generic.go:358] "Generic (PLEG): container finished" podID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerID="ad8e5c8ff258934df8b031e4ca400c00da84dc8e84950692fb247eb689629ffd" exitCode=2 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.243105 5200 generic.go:358] "Generic (PLEG): container finished" podID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerID="3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.243064 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50","Type":"ContainerDied","Data":"ad8e5c8ff258934df8b031e4ca400c00da84dc8e84950692fb247eb689629ffd"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.243346 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50","Type":"ContainerDied","Data":"3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.259096 5200 generic.go:358] "Generic (PLEG): container finished" 
podID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerID="01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.259257 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-779774b9bd-gltc4" event={"ID":"4f6817f6-ce07-4b0f-a6c8-d667faa003a8","Type":"ContainerDied","Data":"01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.265026 5200 generic.go:358] "Generic (PLEG): container finished" podID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerID="7503a8964c681bf49d84540f8809922bb7d2d71d2f4e4f1b1afea676f29d0da7" exitCode=143 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.265962 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"046ac75d-4cfe-40ed-aca5-a5875ea641d8","Type":"ContainerDied","Data":"7503a8964c681bf49d84540f8809922bb7d2d71d2f4e4f1b1afea676f29d0da7"} Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.298946 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7bdlq"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.312131 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.312519 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerName="nova-cell1-conductor-conductor" containerID="cri-o://26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" gracePeriod=30 Nov 26 13:55:05 crc kubenswrapper[5200]: E1126 13:55:05.323772 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.323850 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7bdlq"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.335078 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutronca07-account-delete-xb4zf"] Nov 26 13:55:05 crc kubenswrapper[5200]: E1126 13:55:05.336278 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:55:05 crc kubenswrapper[5200]: E1126 13:55:05.339385 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:55:05 crc kubenswrapper[5200]: E1126 13:55:05.339485 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="adab4e53-c668-44cf-be5d-5802a77124aa" containerName="nova-scheduler-scheduler" probeResult="unknown" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.347603 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.348120 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" gracePeriod=30 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.353543 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c42lw"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.355940 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.363664 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c42lw"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.375820 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cb616b41-7836-4e58-a88b-bb68df018d24/ovsdbserver-sb/0.log" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.375943 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.383341 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance4584-account-delete-7mnx9"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461593 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/cb616b41-7836-4e58-a88b-bb68df018d24-kube-api-access-th8hr\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461668 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-config\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461717 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-config\") pod \"4972e45b-4085-4bbb-a052-95698f889c49\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461757 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtpxf\" (UniqueName: \"kubernetes.io/projected/4972e45b-4085-4bbb-a052-95698f889c49-kube-api-access-vtpxf\") pod \"4972e45b-4085-4bbb-a052-95698f889c49\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461799 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdbserver-sb-tls-certs\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: 
I1126 13:55:05.461856 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-scripts\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461907 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-nb\") pod \"4972e45b-4085-4bbb-a052-95698f889c49\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.461975 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-swift-storage-0\") pod \"4972e45b-4085-4bbb-a052-95698f889c49\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.462078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdb-rundir\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.462154 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.462185 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-metrics-certs-tls-certs\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.462250 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-combined-ca-bundle\") pod \"cb616b41-7836-4e58-a88b-bb68df018d24\" (UID: \"cb616b41-7836-4e58-a88b-bb68df018d24\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.462287 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-svc\") pod \"4972e45b-4085-4bbb-a052-95698f889c49\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.462317 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-sb\") pod \"4972e45b-4085-4bbb-a052-95698f889c49\" (UID: \"4972e45b-4085-4bbb-a052-95698f889c49\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.466561 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-config" (OuterVolumeSpecName: "config") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.466889 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.467615 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-scripts" (OuterVolumeSpecName: "scripts") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.490782 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb616b41-7836-4e58-a88b-bb68df018d24-kube-api-access-th8hr" (OuterVolumeSpecName: "kube-api-access-th8hr") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "kube-api-access-th8hr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.490779 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4972e45b-4085-4bbb-a052-95698f889c49-kube-api-access-vtpxf" (OuterVolumeSpecName: "kube-api-access-vtpxf") pod "4972e45b-4085-4bbb-a052-95698f889c49" (UID: "4972e45b-4085-4bbb-a052-95698f889c49"). InnerVolumeSpecName "kube-api-access-vtpxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.507271 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8vf92_82b72f68-40bd-4a81-914a-dda4355b9fc4/openstack-network-exporter/0.log" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.507695 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.510723 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.518188 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.539216 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placemente6ea-account-delete-s2v6r"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.565848 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566444 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b72f68-40bd-4a81-914a-dda4355b9fc4-config\") pod \"82b72f68-40bd-4a81-914a-dda4355b9fc4\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566515 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovn-rundir\") pod \"82b72f68-40bd-4a81-914a-dda4355b9fc4\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566636 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-metrics-certs-tls-certs\") pod \"82b72f68-40bd-4a81-914a-dda4355b9fc4\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566673 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config\") pod \"b8c23725-1545-4d07-8fc3-a4dd066a685e\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566733 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-combined-ca-bundle\") pod \"b8c23725-1545-4d07-8fc3-a4dd066a685e\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566750 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-combined-ca-bundle\") pod \"82b72f68-40bd-4a81-914a-dda4355b9fc4\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.566791 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovs-rundir\") pod \"82b72f68-40bd-4a81-914a-dda4355b9fc4\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.567478 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf5tk\" (UniqueName: \"kubernetes.io/projected/82b72f68-40bd-4a81-914a-dda4355b9fc4-kube-api-access-hf5tk\") pod \"82b72f68-40bd-4a81-914a-dda4355b9fc4\" (UID: \"82b72f68-40bd-4a81-914a-dda4355b9fc4\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.567558 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config-secret\") pod \"b8c23725-1545-4d07-8fc3-a4dd066a685e\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.567612 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq87g\" (UniqueName: \"kubernetes.io/projected/b8c23725-1545-4d07-8fc3-a4dd066a685e-kube-api-access-nq87g\") pod 
\"b8c23725-1545-4d07-8fc3-a4dd066a685e\" (UID: \"b8c23725-1545-4d07-8fc3-a4dd066a685e\") " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.568090 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "82b72f68-40bd-4a81-914a-dda4355b9fc4" (UID: "82b72f68-40bd-4a81-914a-dda4355b9fc4"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.568532 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "82b72f68-40bd-4a81-914a-dda4355b9fc4" (UID: "82b72f68-40bd-4a81-914a-dda4355b9fc4"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.569018 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82b72f68-40bd-4a81-914a-dda4355b9fc4-config" (OuterVolumeSpecName: "config") pod "82b72f68-40bd-4a81-914a-dda4355b9fc4" (UID: "82b72f68-40bd-4a81-914a-dda4355b9fc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.569349 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82b72f68-40bd-4a81-914a-dda4355b9fc4-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.569439 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.569512 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.569569 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/cb616b41-7836-4e58-a88b-bb68df018d24-kube-api-access-th8hr\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.569624 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.570084 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtpxf\" (UniqueName: \"kubernetes.io/projected/4972e45b-4085-4bbb-a052-95698f889c49-kube-api-access-vtpxf\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.570156 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb616b41-7836-4e58-a88b-bb68df018d24-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.570213 5200 reconciler_common.go:299] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82b72f68-40bd-4a81-914a-dda4355b9fc4-ovs-rundir\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.570284 5200 
reconciler_common.go:299] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.570351 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.590974 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b72f68-40bd-4a81-914a-dda4355b9fc4-kube-api-access-hf5tk" (OuterVolumeSpecName: "kube-api-access-hf5tk") pod "82b72f68-40bd-4a81-914a-dda4355b9fc4" (UID: "82b72f68-40bd-4a81-914a-dda4355b9fc4"). InnerVolumeSpecName "kube-api-access-hf5tk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.610677 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c23725-1545-4d07-8fc3-a4dd066a685e-kube-api-access-nq87g" (OuterVolumeSpecName: "kube-api-access-nq87g") pod "b8c23725-1545-4d07-8fc3-a4dd066a685e" (UID: "b8c23725-1545-4d07-8fc3-a4dd066a685e"). InnerVolumeSpecName "kube-api-access-nq87g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.626875 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4972e45b-4085-4bbb-a052-95698f889c49" (UID: "4972e45b-4085-4bbb-a052-95698f889c49"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.658534 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8c23725-1545-4d07-8fc3-a4dd066a685e" (UID: "b8c23725-1545-4d07-8fc3-a4dd066a685e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.673305 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hf5tk\" (UniqueName: \"kubernetes.io/projected/82b72f68-40bd-4a81-914a-dda4355b9fc4-kube-api-access-hf5tk\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.673355 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nq87g\" (UniqueName: \"kubernetes.io/projected/b8c23725-1545-4d07-8fc3-a4dd066a685e-kube-api-access-nq87g\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.673368 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.673384 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.675358 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.711956 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican0a47-account-delete-92fz8"] Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.727165 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-config" (OuterVolumeSpecName: "config") pod "4972e45b-4085-4bbb-a052-95698f889c49" (UID: "4972e45b-4085-4bbb-a052-95698f889c49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.752025 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4972e45b-4085-4bbb-a052-95698f889c49" (UID: "4972e45b-4085-4bbb-a052-95698f889c49"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.781061 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4972e45b-4085-4bbb-a052-95698f889c49" (UID: "4972e45b-4085-4bbb-a052-95698f889c49"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.783170 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b8c23725-1545-4d07-8fc3-a4dd066a685e" (UID: "b8c23725-1545-4d07-8fc3-a4dd066a685e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.784520 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.784559 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.784571 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.784580 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.784591 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: W1126 13:55:05.805499 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaee6c82_a418_4a1d_a364_0a0eb9073c09.slice/crio-ac3642ca604ff8048aa6fd689e3a6746e7d1963e0b4878c04453ee4daaefd0c5 WatchSource:0}: Error finding container ac3642ca604ff8048aa6fd689e3a6746e7d1963e0b4878c04453ee4daaefd0c5: Status 404 returned error can't find the container with id ac3642ca604ff8048aa6fd689e3a6746e7d1963e0b4878c04453ee4daaefd0c5 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.825531 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82b72f68-40bd-4a81-914a-dda4355b9fc4" (UID: "82b72f68-40bd-4a81-914a-dda4355b9fc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: W1126 13:55:05.830784 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db404a7_2c7c_4a0c_b21c_570bb473b1ff.slice/crio-6777c983b2370ab4787dc430fd28bf0f7626c87a80410448c20c194b39a9e2e8 WatchSource:0}: Error finding container 6777c983b2370ab4787dc430fd28bf0f7626c87a80410448c20c194b39a9e2e8: Status 404 returned error can't find the container with id 6777c983b2370ab4787dc430fd28bf0f7626c87a80410448c20c194b39a9e2e8 Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.889047 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.941608 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). 
InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.979106 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4972e45b-4085-4bbb-a052-95698f889c49" (UID: "4972e45b-4085-4bbb-a052-95698f889c49"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.997038 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.997095 5200 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4972e45b-4085-4bbb-a052-95698f889c49-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:05 crc kubenswrapper[5200]: I1126 13:55:05.999781 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "82b72f68-40bd-4a81-914a-dda4355b9fc4" (UID: "82b72f68-40bd-4a81-914a-dda4355b9fc4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: W1126 13:55:06.027242 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fdc5a98_ccb9_4059_ba29_729ad8af9c4f.slice/crio-eec4a7b4976510279b79d832079909fbc0a15d5c25205c02a5ce70e9d3b72d24 WatchSource:0}: Error finding container eec4a7b4976510279b79d832079909fbc0a15d5c25205c02a5ce70e9d3b72d24: Status 404 returned error can't find the container with id eec4a7b4976510279b79d832079909fbc0a15d5c25205c02a5ce70e9d3b72d24 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.053808 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b8c23725-1545-4d07-8fc3-a4dd066a685e" (UID: "b8c23725-1545-4d07-8fc3-a4dd066a685e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.078364 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af is running failed: container process not found" containerID="3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.079272 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af is running failed: container process not found" containerID="3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.079851 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af is running failed: container process not found" containerID="3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.079884 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="ovsdbserver-nb" probeResult="unknown" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.099806 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b72f68-40bd-4a81-914a-dda4355b9fc4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.099842 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8c23725-1545-4d07-8fc3-a4dd066a685e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.126203 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cb616b41-7836-4e58-a88b-bb68df018d24" (UID: "cb616b41-7836-4e58-a88b-bb68df018d24"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.157589 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.172999 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.179266 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.179335 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="ovn-northd" probeResult="unknown" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.202294 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb616b41-7836-4e58-a88b-bb68df018d24-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.202416 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.202484 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data podName:5f5ea8d4-ba2b-4abe-b645-0513268421f9 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.202463779 +0000 UTC m=+1478.881665597 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data") pod "rabbitmq-server-0" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9") : configmap "rabbitmq-config-data" not found Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.296882 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8238a7a4-ddcb-4cc8-ac43-72042e4c4f50/ovsdbserver-nb/0.log" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.296981 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.297782 5200 generic.go:358] "Generic (PLEG): container finished" podID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerID="8eb763e1a24e243fecca1c957e0644829929333388f6e4d32ab9411ef66f8cf1" exitCode=0 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.297858 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"937b077a-80f5-4f02-a636-68791a2cfe0f","Type":"ContainerDied","Data":"8eb763e1a24e243fecca1c957e0644829929333388f6e4d32ab9411ef66f8cf1"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.301668 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0a47-account-delete-92fz8" event={"ID":"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f","Type":"ContainerStarted","Data":"eec4a7b4976510279b79d832079909fbc0a15d5c25205c02a5ce70e9d3b72d24"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.311862 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.311892 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f7d474f5-w7lqk" event={"ID":"4972e45b-4085-4bbb-a052-95698f889c49","Type":"ContainerDied","Data":"083b5aebf9fd043c758835208e058a965da03cd1181cea157c7cd3eaf83bb10e"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.311951 5200 scope.go:117] "RemoveContainer" containerID="e57e9a52f8cde4ed409c4daef24f8b64435e90ed6915c4ccca0b63cbaa9821a8" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.312726 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.336196 5200 generic.go:358] "Generic (PLEG): container finished" podID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" containerID="dae2b75b63a59901de07ca31786301f42b38647175de1bf352b0350a4ab170d9" exitCode=0 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.336271 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"58fe5504-07bd-4c4d-a2b4-3fbaa402649d","Type":"ContainerDied","Data":"dae2b75b63a59901de07ca31786301f42b38647175de1bf352b0350a4ab170d9"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.349639 5200 generic.go:358] "Generic (PLEG): container finished" podID="b0772c27-b753-402f-aded-0d4050bc80bd" containerID="878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080" exitCode=0 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.353831 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.354587 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" event={"ID":"b0772c27-b753-402f-aded-0d4050bc80bd","Type":"ContainerDied","Data":"878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.354630 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75cc5c4f79-dfmjl" event={"ID":"b0772c27-b753-402f-aded-0d4050bc80bd","Type":"ContainerDied","Data":"03070f3a8b2aa95508e55cda4d5a50a20d3d05ca5965aa8664bf38504e8b2d2b"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.361826 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placemente6ea-account-delete-s2v6r" event={"ID":"05534089-21c2-4d8c-9b9e-d21c7082e727","Type":"ContainerStarted","Data":"872e7fcf8cb0537db11950616c119948931b0102090d8fb123aa10bb5b349a44"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.375125 5200 generic.go:358] "Generic (PLEG): container finished" podID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerID="cda7517c7cfab2ef565f5c3fa5201af6cdade851c6625c180d2015174e06c39f" exitCode=0 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.375275 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7163ac1c-74a2-4383-bb8b-21a320eb6863","Type":"ContainerDied","Data":"cda7517c7cfab2ef565f5c3fa5201af6cdade851c6625c180d2015174e06c39f"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.380181 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutronca07-account-delete-xb4zf" event={"ID":"caee6c82-a418-4a1d-a364-0a0eb9073c09","Type":"ContainerStarted","Data":"ac3642ca604ff8048aa6fd689e3a6746e7d1963e0b4878c04453ee4daaefd0c5"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.404170 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cb616b41-7836-4e58-a88b-bb68df018d24/ovsdbserver-sb/0.log" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.404292 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb616b41-7836-4e58-a88b-bb68df018d24","Type":"ContainerDied","Data":"ecbbbbf41072a038578703834cff83b26f4c2b1fa187c491036e99ce716e9272"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.405024 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417359 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417455 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417520 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-etc-swift\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417629 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-run-httpd\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417698 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-log-httpd\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417716 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-combined-ca-bundle\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417759 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdb-rundir\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417797 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-scripts\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417826 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-config-data\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417914 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdbserver-nb-tls-certs\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 
13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417947 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-internal-tls-certs\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.417992 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-combined-ca-bundle\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.418039 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6kkg\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-kube-api-access-z6kkg\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.418064 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5xvj\" (UniqueName: \"kubernetes.io/projected/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-kube-api-access-f5xvj\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.418117 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-config\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.418203 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-metrics-certs-tls-certs\") pod \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\" (UID: \"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.418487 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.420132 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.420753 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-config" (OuterVolumeSpecName: "config") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.425249 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.425270 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.425280 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.425399 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.425457 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8238a7a4-ddcb-4cc8-ac43-72042e4c4f50/ovsdbserver-nb/0.log" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.425930 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.426124 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8238a7a4-ddcb-4cc8-ac43-72042e4c4f50","Type":"ContainerDied","Data":"0e841449e4eb0414395ea3facea2006a78302a8213608e2cdd3e15ba212e9e9a"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.430937 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-scripts" (OuterVolumeSpecName: "scripts") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.437635 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-kube-api-access-f5xvj" (OuterVolumeSpecName: "kube-api-access-f5xvj") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "kube-api-access-f5xvj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.437814 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-kube-api-access-z6kkg" (OuterVolumeSpecName: "kube-api-access-z6kkg") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "kube-api-access-z6kkg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.438630 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.449600 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinderf459-account-delete-h8qgw" event={"ID":"ab255459-89c5-462e-a89d-eb26ba377f79","Type":"ContainerStarted","Data":"0569a45037dddceea1861312750535ac41f06661a246674622d908d6cd002217"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.449812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.478060 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="8b51fec0244bcdc2b862bc7cdd69319c60e9093b1b1eee3641c5eec3c4239828" exitCode=0 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.478091 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="903ac30a9dbec48f735e990bea93a6806f1762703320ed6711cbac6fbff79c06" exitCode=0 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.478138 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"8b51fec0244bcdc2b862bc7cdd69319c60e9093b1b1eee3641c5eec3c4239828"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.478168 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"903ac30a9dbec48f735e990bea93a6806f1762703320ed6711cbac6fbff79c06"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.479668 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f7d474f5-w7lqk"] Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.484645 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.488218 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance4584-account-delete-7mnx9" event={"ID":"8db404a7-2c7c-4a0c-b21c-570bb473b1ff","Type":"ContainerStarted","Data":"6777c983b2370ab4787dc430fd28bf0f7626c87a80410448c20c194b39a9e2e8"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.489528 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.499621 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8vf92_82b72f68-40bd-4a81-914a-dda4355b9fc4/openstack-network-exporter/0.log" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.499704 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vf92" event={"ID":"82b72f68-40bd-4a81-914a-dda4355b9fc4","Type":"ContainerDied","Data":"ec2b94313c69a6266c3798bf759b2f31e170dc51201062d579193afea0c477b8"} Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.499810 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vf92" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.501035 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.528624 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.531133 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs\") pod \"b0772c27-b753-402f-aded-0d4050bc80bd\" (UID: \"b0772c27-b753-402f-aded-0d4050bc80bd\") " Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.533655 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f7d474f5-w7lqk"] Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.537951 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.538171 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.538653 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6kkg\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-kube-api-access-z6kkg\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.538770 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f5xvj\" (UniqueName: \"kubernetes.io/projected/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-kube-api-access-f5xvj\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.538948 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 26 
13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.539049 5200 reconciler_common.go:299] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0772c27-b753-402f-aded-0d4050bc80bd-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.539556 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0772c27-b753-402f-aded-0d4050bc80bd-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: W1126 13:55:06.536601 5200 empty_dir.go:511] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b0772c27-b753-402f-aded-0d4050bc80bd/volumes/kubernetes.io~secret/public-tls-certs Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.539879 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.543055 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.580993 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.615404 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.629930 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.640586 5200 scope.go:117] "RemoveContainer" containerID="7c7d2aa287af6438dfeaebe64c1768b07474f9d5fc766f50b84d23a182c3422a" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.645873 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.655018 5200 status_manager.go:895] "Failed to get status for pod" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.684934 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8vf92"] Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.693234 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.695994 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-metrics-8vf92"] Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.749381 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.749497 5200 secret.go:189] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.749560 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.749541387 +0000 UTC m=+1479.428743205 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-default-external-config-data" not found Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.749994 5200 secret.go:189] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.750224 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts podName:4bdf37de-cbcd-4971-a688-f3b91cf98b47 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.750211655 +0000 UTC m=+1479.429413473 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts") pod "glance-default-external-api-0" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47") : secret "glance-scripts" not found Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.786311 5200 scope.go:117] "RemoveContainer" containerID="878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080" Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.793325 5200 handlers.go:80] "Exec lifecycle hook for Container in Pod failed" err=< Nov 26 13:55:06 crc kubenswrapper[5200]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-11-26T13:55:04Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Nov 26 13:55:06 crc kubenswrapper[5200]: /etc/init.d/functions: line 589: 411 Alarm clock "$@" Nov 26 13:55:06 crc kubenswrapper[5200]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-kp5s8" message=< Nov 26 13:55:06 crc kubenswrapper[5200]: Exiting ovn-controller (1) [FAILED] Nov 26 13:55:06 crc kubenswrapper[5200]: Killing ovn-controller (1) [ OK ] Nov 26 13:55:06 crc kubenswrapper[5200]: 2025-11-26T13:55:04Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Nov 26 13:55:06 crc kubenswrapper[5200]: /etc/init.d/functions: line 589: 411 Alarm clock "$@" Nov 26 13:55:06 crc kubenswrapper[5200]: > Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.793365 5200 kuberuntime_container.go:741] "PreStop hook failed" err=< Nov 26 13:55:06 crc kubenswrapper[5200]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-11-26T13:55:04Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Nov 26 13:55:06 crc kubenswrapper[5200]: /etc/init.d/functions: line 
589: 411 Alarm clock "$@" Nov 26 13:55:06 crc kubenswrapper[5200]: > pod="openstack/ovn-controller-kp5s8" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" containerID="cri-o://7bb16d3a8f9b0cf1fc4e509932320bf940df760e4ccedec11d61d40970672efa" Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.793423 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-controller-kp5s8" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" containerID="cri-o://7bb16d3a8f9b0cf1fc4e509932320bf940df760e4ccedec11d61d40970672efa" gracePeriod=27 Nov 26 13:55:06 crc kubenswrapper[5200]: I1126 13:55:06.800186 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kp5s8" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" probeResult="failure" output="" Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.852581 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:06 crc kubenswrapper[5200]: E1126 13:55:06.858877 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data podName:4c5390ed-d89e-4450-adf3-1363b1150d1b nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.858832944 +0000 UTC m=+1479.538034762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b") : configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.009456 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c6bcc3-5315-43f6-9765-408aaa9e882f" path="/var/lib/kubelet/pods/34c6bcc3-5315-43f6-9765-408aaa9e882f/volumes" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.010183 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4972e45b-4085-4bbb-a052-95698f889c49" path="/var/lib/kubelet/pods/4972e45b-4085-4bbb-a052-95698f889c49/volumes" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.010789 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721e7087-5753-4216-9598-3497ce8f92a9" path="/var/lib/kubelet/pods/721e7087-5753-4216-9598-3497ce8f92a9/volumes" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.017991 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b72f68-40bd-4a81-914a-dda4355b9fc4" path="/var/lib/kubelet/pods/82b72f68-40bd-4a81-914a-dda4355b9fc4/volumes" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.019835 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c23725-1545-4d07-8fc3-a4dd066a685e" path="/var/lib/kubelet/pods/b8c23725-1545-4d07-8fc3-a4dd066a685e/volumes" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.020461 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" path="/var/lib/kubelet/pods/cb616b41-7836-4e58-a88b-bb68df018d24/volumes" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.069234 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" 
(UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.083375 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-config-data" (OuterVolumeSpecName: "config-data") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.131831 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.133054 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.134913 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.134979 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" probeResult="unknown" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.155319 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" (UID: "8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.163476 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.166001 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.166026 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.166036 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.171507 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.188205 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.188259 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" probeResult="unknown" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.196930 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0772c27-b753-402f-aded-0d4050bc80bd" (UID: "b0772c27-b753-402f-aded-0d4050bc80bd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.267443 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0772c27-b753-402f-aded-0d4050bc80bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.326771 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.328634 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.329621 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.329654 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerName="nova-cell1-conductor-conductor" probeResult="unknown" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.398896 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/placement-779774b9bd-gltc4" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.147:8778/\": read tcp 10.217.0.2:58902->10.217.0.147:8778: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.399298 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/placement-779774b9bd-gltc4" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.147:8778/\": read tcp 10.217.0.2:58912->10.217.0.147:8778: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: E1126 13:55:07.432863 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab255459_89c5_462e_a89d_eb26ba377f79.slice/crio-958957b973036b5e2eae357d60b2788b353e1ed18b3527c2edf63e87c97959a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab255459_89c5_462e_a89d_eb26ba377f79.slice/crio-conmon-958957b973036b5e2eae357d60b2788b353e1ed18b3527c2edf63e87c97959a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db404a7_2c7c_4a0c_b21c_570bb473b1ff.slice/crio-conmon-db44a245ce96292b488afb54ed4cd43056384ab28322643e447b6c397b81175f.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05534089_21c2_4d8c_9b9e_d21c7082e727.slice/crio-04f3c7f81e26aea0c5d43348c1a7d828024b5a90fc0938d3e7f0adb21f560dc2.scope\": RecentStats: unable to find data in memory cache]" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.451325 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.162:8776/healthcheck\": read tcp 10.217.0.2:50794->10.217.0.162:8776: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.531118 5200 generic.go:358] "Generic (PLEG): container finished" podID="05534089-21c2-4d8c-9b9e-d21c7082e727" containerID="04f3c7f81e26aea0c5d43348c1a7d828024b5a90fc0938d3e7f0adb21f560dc2" exitCode=0 Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.540853 5200 generic.go:358] "Generic (PLEG): container finished" podID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerID="5bb653ca071dc9a61e5ecf99b1fe3a17c36b5dd2807279eee649a3530583f056" exitCode=0 Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.556563 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.168:9292/healthcheck\": read tcp 10.217.0.2:54112->10.217.0.168:9292: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.556621 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.168:9292/healthcheck\": read tcp 10.217.0.2:54122->10.217.0.168:9292: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.568655 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kp5s8_743192ea-7611-42dc-ab35-0dd584590479/ovn-controller/0.log" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.568723 5200 generic.go:358] "Generic (PLEG): container finished" podID="743192ea-7611-42dc-ab35-0dd584590479" containerID="7bb16d3a8f9b0cf1fc4e509932320bf940df760e4ccedec11d61d40970672efa" exitCode=143 Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.573708 5200 generic.go:358] "Generic (PLEG): container finished" podID="ab255459-89c5-462e-a89d-eb26ba377f79" containerID="958957b973036b5e2eae357d60b2788b353e1ed18b3527c2edf63e87c97959a7" exitCode=0 Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.578506 5200 generic.go:358] "Generic (PLEG): container finished" podID="8db404a7-2c7c-4a0c-b21c-570bb473b1ff" containerID="db44a245ce96292b488afb54ed4cd43056384ab28322643e447b6c397b81175f" exitCode=0 Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.633557 5200 generic.go:358] "Generic (PLEG): container finished" podID="8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" containerID="30a3209f439deb9a511a8367f54f1b470b4ab50a043f4340c8e0dff3bed96a28" exitCode=0 Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636661 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0e30c-account-delete-pnqrp"] Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636739 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placemente6ea-account-delete-s2v6r" 
event={"ID":"05534089-21c2-4d8c-9b9e-d21c7082e727","Type":"ContainerDied","Data":"04f3c7f81e26aea0c5d43348c1a7d828024b5a90fc0938d3e7f0adb21f560dc2"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636771 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7163ac1c-74a2-4383-bb8b-21a320eb6863","Type":"ContainerDied","Data":"5bb653ca071dc9a61e5ecf99b1fe3a17c36b5dd2807279eee649a3530583f056"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636788 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7163ac1c-74a2-4383-bb8b-21a320eb6863","Type":"ContainerDied","Data":"8179fb265e4f83a60e3ca62714109aa5e9d0253757c808d971ed3165d6745dd9"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636799 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8179fb265e4f83a60e3ca62714109aa5e9d0253757c808d971ed3165d6745dd9" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636812 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi5c5d-account-delete-n94tm"] Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.636825 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5c5d-account-delete-n94tm" event={"ID":"225d875d-9c0b-4acd-9305-b5cd82ea9056","Type":"ContainerStarted","Data":"7c198d7cc3b7d48e26594850911ce3af2caccafa8ecf629c34e9c3402a122711"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.645583 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8" event={"ID":"743192ea-7611-42dc-ab35-0dd584590479","Type":"ContainerDied","Data":"7bb16d3a8f9b0cf1fc4e509932320bf940df760e4ccedec11d61d40970672efa"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.645652 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kp5s8" event={"ID":"743192ea-7611-42dc-ab35-0dd584590479","Type":"ContainerDied","Data":"e0809c7e4b7929c0761155dec6ef452bd7bcc964baea60a35d2ea6007dd719c9"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.645669 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0809c7e4b7929c0761155dec6ef452bd7bcc964baea60a35d2ea6007dd719c9" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.645701 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinderf459-account-delete-h8qgw" event={"ID":"ab255459-89c5-462e-a89d-eb26ba377f79","Type":"ContainerDied","Data":"958957b973036b5e2eae357d60b2788b353e1ed18b3527c2edf63e87c97959a7"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.646475 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance4584-account-delete-7mnx9" event={"ID":"8db404a7-2c7c-4a0c-b21c-570bb473b1ff","Type":"ContainerDied","Data":"db44a245ce96292b488afb54ed4cd43056384ab28322643e447b6c397b81175f"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.646504 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"937b077a-80f5-4f02-a636-68791a2cfe0f","Type":"ContainerDied","Data":"cdc8cdf94f4ea78ae1d9c535b352b21a0dec18159d1e7a4725c1f8f6049efcb7"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.646520 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdc8cdf94f4ea78ae1d9c535b352b21a0dec18159d1e7a4725c1f8f6049efcb7" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.646531 5200 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican0a47-account-delete-92fz8" event={"ID":"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f","Type":"ContainerDied","Data":"30a3209f439deb9a511a8367f54f1b470b4ab50a043f4340c8e0dff3bed96a28"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.653290 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"58fe5504-07bd-4c4d-a2b4-3fbaa402649d","Type":"ContainerDied","Data":"fc5e90d97911d7b404dc238e2ac23acd87020306506110bb00c6631f40913e68"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.653466 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc5e90d97911d7b404dc238e2ac23acd87020306506110bb00c6631f40913e68" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.659085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e30c-account-delete-pnqrp" event={"ID":"6d529441-c1c2-4f95-bc63-ce522b9f1b07","Type":"ContainerStarted","Data":"55face2d87f37ebc312073e998bdd0f880495858b9987731454264575e8ed7de"} Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.730519 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.772755 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-75cc5c4f79-dfmjl"] Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.784859 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-75cc5c4f79-dfmjl"] Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.792245 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-nova-novncproxy-tls-certs\") pod \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.792447 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdms\" (UniqueName: \"kubernetes.io/projected/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-kube-api-access-2jdms\") pod \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.792548 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-vencrypt-tls-certs\") pod \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.792602 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-config-data\") pod \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.792643 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-combined-ca-bundle\") pod \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\" (UID: \"58fe5504-07bd-4c4d-a2b4-3fbaa402649d\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.804059 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.806203 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.169:9292/healthcheck\": read tcp 10.217.0.2:45360->10.217.0.169:9292: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.806383 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.169:9292/healthcheck\": read tcp 10.217.0.2:45366->10.217.0.169:9292: read: connection reset by peer" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.819534 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.855491 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-kube-api-access-2jdms" (OuterVolumeSpecName: "kube-api-access-2jdms") pod "58fe5504-07bd-4c4d-a2b4-3fbaa402649d" (UID: "58fe5504-07bd-4c4d-a2b4-3fbaa402649d"). InnerVolumeSpecName "kube-api-access-2jdms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.892914 5200 scope.go:117] "RemoveContainer" containerID="3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.893694 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfb24\" (UniqueName: \"kubernetes.io/projected/937b077a-80f5-4f02-a636-68791a2cfe0f-kube-api-access-vfb24\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.893740 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-combined-ca-bundle\") pod \"7163ac1c-74a2-4383-bb8b-21a320eb6863\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.893853 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-default\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.893888 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-combined-ca-bundle\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.893910 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-operator-scripts\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.893968 5200 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data-custom\") pod \"7163ac1c-74a2-4383-bb8b-21a320eb6863\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.895263 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-kolla-config\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.895291 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-galera-tls-certs\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.895327 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-generated\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.895409 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdflj\" (UniqueName: \"kubernetes.io/projected/7163ac1c-74a2-4383-bb8b-21a320eb6863-kube-api-access-qdflj\") pod \"7163ac1c-74a2-4383-bb8b-21a320eb6863\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.895481 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data\") pod \"7163ac1c-74a2-4383-bb8b-21a320eb6863\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.896167 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.896699 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.896725 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.897288 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"937b077a-80f5-4f02-a636-68791a2cfe0f\" (UID: \"937b077a-80f5-4f02-a636-68791a2cfe0f\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.897328 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7163ac1c-74a2-4383-bb8b-21a320eb6863-etc-machine-id\") pod \"7163ac1c-74a2-4383-bb8b-21a320eb6863\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.897399 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-scripts\") pod \"7163ac1c-74a2-4383-bb8b-21a320eb6863\" (UID: \"7163ac1c-74a2-4383-bb8b-21a320eb6863\") " Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.898281 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.898366 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.898434 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/937b077a-80f5-4f02-a636-68791a2cfe0f-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.898499 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jdms\" (UniqueName: \"kubernetes.io/projected/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-kube-api-access-2jdms\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.900009 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.900079 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7163ac1c-74a2-4383-bb8b-21a320eb6863-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7163ac1c-74a2-4383-bb8b-21a320eb6863" (UID: "7163ac1c-74a2-4383-bb8b-21a320eb6863"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.928020 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7163ac1c-74a2-4383-bb8b-21a320eb6863" (UID: "7163ac1c-74a2-4383-bb8b-21a320eb6863"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.929462 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-scripts" (OuterVolumeSpecName: "scripts") pod "7163ac1c-74a2-4383-bb8b-21a320eb6863" (UID: "7163ac1c-74a2-4383-bb8b-21a320eb6863"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.931882 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937b077a-80f5-4f02-a636-68791a2cfe0f-kube-api-access-vfb24" (OuterVolumeSpecName: "kube-api-access-vfb24") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "kube-api-access-vfb24". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.956237 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7163ac1c-74a2-4383-bb8b-21a320eb6863-kube-api-access-qdflj" (OuterVolumeSpecName: "kube-api-access-qdflj") pod "7163ac1c-74a2-4383-bb8b-21a320eb6863" (UID: "7163ac1c-74a2-4383-bb8b-21a320eb6863"). InnerVolumeSpecName "kube-api-access-qdflj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:07 crc kubenswrapper[5200]: I1126 13:55:07.986844 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023453 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vfb24\" (UniqueName: \"kubernetes.io/projected/937b077a-80f5-4f02-a636-68791a2cfe0f-kube-api-access-vfb24\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023489 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023498 5200 reconciler_common.go:299] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/937b077a-80f5-4f02-a636-68791a2cfe0f-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023507 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdflj\" (UniqueName: \"kubernetes.io/projected/7163ac1c-74a2-4383-bb8b-21a320eb6863-kube-api-access-qdflj\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023531 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023539 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7163ac1c-74a2-4383-bb8b-21a320eb6863-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.023547 5200 reconciler_common.go:299] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.053727 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:60086->10.217.0.200:8775: read: connection reset by peer" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.053737 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:60074->10.217.0.200:8775: read: connection reset by peer" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.068841 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58fe5504-07bd-4c4d-a2b4-3fbaa402649d" (UID: "58fe5504-07bd-4c4d-a2b4-3fbaa402649d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.084298 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.096289 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-config-data" (OuterVolumeSpecName: "config-data") pod "58fe5504-07bd-4c4d-a2b4-3fbaa402649d" (UID: "58fe5504-07bd-4c4d-a2b4-3fbaa402649d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.125959 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.125995 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.126009 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.195325 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.195639 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-central-agent" containerID="cri-o://b908cb3b6ebf4aeb47d606fa2e54c25261e51e4dbd764b3dd457699d4b48474a" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.196916 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="proxy-httpd" containerID="cri-o://b9326640d2dd0dc2cb465cceabfeb8666c56ac7ec888f7a972c9d1baf978168e" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.196992 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="sg-core" containerID="cri-o://c12f4bdddf37a1ef5e860426fe948bdb51ce3f0edf70735ec8c9ab110cb59d7e" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.197032 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-notification-agent" containerID="cri-o://52bb01fb6b98b6d753a0b12d991a396878b553610972bbf7719db7a828b9c734" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.222670 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.236370 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.239463 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "58fe5504-07bd-4c4d-a2b4-3fbaa402649d" (UID: "58fe5504-07bd-4c4d-a2b4-3fbaa402649d"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.239602 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.239868 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" containerName="kube-state-metrics" containerID="cri-o://956d596d413d6bc508604849be962ec96f688d90bc9bd6e3f6b5953f97e5186e" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.292717 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "58fe5504-07bd-4c4d-a2b4-3fbaa402649d" (UID: "58fe5504-07bd-4c4d-a2b4-3fbaa402649d"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.339059 5200 reconciler_common.go:299] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.339094 5200 reconciler_common.go:299] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/58fe5504-07bd-4c4d-a2b4-3fbaa402649d-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.358704 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:57248->10.217.0.161:9311: read: connection reset by peer" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.360350 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "937b077a-80f5-4f02-a636-68791a2cfe0f" (UID: "937b077a-80f5-4f02-a636-68791a2cfe0f"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.383440 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.383675 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/memcached-0" podUID="55e57b83-77f4-446e-bc0a-c78f9094aacb" containerName="memcached" containerID="cri-o://4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.416877 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7163ac1c-74a2-4383-bb8b-21a320eb6863" (UID: "7163ac1c-74a2-4383-bb8b-21a320eb6863"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.444968 5200 reconciler_common.go:299] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/937b077a-80f5-4f02-a636-68791a2cfe0f-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.445004 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.456253 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x7qzt"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.467823 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x7qzt"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.477424 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-gsmtt"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.486776 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-gsmtt"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.505529 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystonebf07-account-delete-4m8mk"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506732 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82b72f68-40bd-4a81-914a-dda4355b9fc4" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506755 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b72f68-40bd-4a81-914a-dda4355b9fc4" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506780 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-httpd" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506793 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-httpd" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506816 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerName="mysql-bootstrap" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506824 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerName="mysql-bootstrap" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506841 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4972e45b-4085-4bbb-a052-95698f889c49" containerName="dnsmasq-dns" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506846 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972e45b-4085-4bbb-a052-95698f889c49" containerName="dnsmasq-dns" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506857 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506863 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506873 
5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="ovsdbserver-sb" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506879 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="ovsdbserver-sb" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506906 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="probe" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506911 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="probe" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506924 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-server" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506930 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-server" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506939 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerName="galera" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506944 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerName="galera" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506958 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506964 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506972 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506978 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506988 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="ovsdbserver-nb" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.506993 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="ovsdbserver-nb" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507010 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4972e45b-4085-4bbb-a052-95698f889c49" containerName="init" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507016 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4972e45b-4085-4bbb-a052-95698f889c49" containerName="init" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507026 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="cinder-scheduler" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507032 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="cinder-scheduler" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507189 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507201 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-httpd" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507215 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" containerName="galera" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507227 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="82b72f68-40bd-4a81-914a-dda4355b9fc4" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507236 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" containerName="ovsdbserver-nb" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507243 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="probe" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507249 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" containerName="cinder-scheduler" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507258 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" containerName="proxy-server" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507265 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507274 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4972e45b-4085-4bbb-a052-95698f889c49" containerName="dnsmasq-dns" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507284 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="ovsdbserver-sb" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.507291 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb616b41-7836-4e58-a88b-bb68df018d24" containerName="openstack-network-exporter" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.514102 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-557444d9fd-t9lq7"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.514449 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/keystone-557444d9fd-t9lq7" podUID="11e2f1eb-442a-4f5c-845c-1ecf10744fd1" containerName="keystone-api" containerID="cri-o://38cdc95cb224169d9fc8e6fa2db145ec76be63b846d7793abc56e1bd23843e4c" gracePeriod=30 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.514701 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.515295 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystonebf07-account-delete-4m8mk"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.522180 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.550737 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data" (OuterVolumeSpecName: "config-data") pod "7163ac1c-74a2-4383-bb8b-21a320eb6863" (UID: "7163ac1c-74a2-4383-bb8b-21a320eb6863"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.584780 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kp5s8_743192ea-7611-42dc-ab35-0dd584590479/ovn-controller/0.log" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.584856 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kp5s8" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.612021 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jjw96"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.621690 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jjw96"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.626153 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.649548 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.649796 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99b4k\" (UniqueName: \"kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.649846 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7163ac1c-74a2-4383-bb8b-21a320eb6863-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.652239 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e30c-account-create-update-shf9k"] Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.669823 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.684430 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openstack/novacell0e30c-account-delete-pnqrp"] Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.692196 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.693234 5200 scope.go:117] "RemoveContainer" containerID="878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.694352 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e30c-account-create-update-shf9k"] Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.710035 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080\": container with ID starting with 878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080 not found: ID does not exist" containerID="878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.710090 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080"} err="failed to get container status \"878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080\": rpc error: code = NotFound desc = could not find container \"878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080\": container with ID starting with 878d9ca330ef52e688b084638e3e8db9817bc71eb415a9688fa8eb29edf48080 not found: ID does not exist" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.710115 5200 scope.go:117] "RemoveContainer" containerID="3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9" Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.710126 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.710167 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" containerName="nova-cell0-conductor-conductor" probeResult="unknown" Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.711570 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9\": container with ID starting with 3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9 not found: ID does not exist" containerID="3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.711818 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9"} err="failed 
to get container status \"3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9\": rpc error: code = NotFound desc = could not find container \"3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9\": container with ID starting with 3477c8b1faa4f26d93225bffb30a150f83cb5c55c2fb511adf58b66b5ceda7d9 not found: ID does not exist" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.711843 5200 scope.go:117] "RemoveContainer" containerID="76f45a7a6ec5de6fe10745e16931f137f558e7423c3c48a60e8dae410a4c952b" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.712034 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bwjrj"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.713316 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.723560 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bwjrj"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.746000 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bf07-account-create-update-97k2v"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-logs\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751175 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-log-ovn\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751225 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-combined-ca-bundle\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751281 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btmcn\" (UniqueName: \"kubernetes.io/projected/743192ea-7611-42dc-ab35-0dd584590479-kube-api-access-btmcn\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751329 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-public-tls-certs\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751438 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-config-data\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751473 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run-ovn\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751497 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpvht\" (UniqueName: \"kubernetes.io/projected/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-kube-api-access-bpvht\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751570 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751696 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-internal-tls-certs\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751790 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-combined-ca-bundle\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751843 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-ovn-controller-tls-certs\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751872 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-scripts\") pod \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\" (UID: \"4f6817f6-ce07-4b0f-a6c8-d667faa003a8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.751893 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/743192ea-7611-42dc-ab35-0dd584590479-scripts\") pod \"743192ea-7611-42dc-ab35-0dd584590479\" (UID: \"743192ea-7611-42dc-ab35-0dd584590479\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752203 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99b4k\" (UniqueName: \"kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752242 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.752378 5200 configmap.go:193] 
Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.752431 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts podName:ab55370d-6063-4303-b300-104b73319e88 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:09.252420357 +0000 UTC m=+1477.931622175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts") pod "keystonebf07-account-delete-4m8mk" (UID: "ab55370d-6063-4303-b300-104b73319e88") : configmap "openstack-scripts" not found Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752732 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystonebf07-account-delete-4m8mk"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752825 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run" (OuterVolumeSpecName: "var-run") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752836 5200 generic.go:358] "Generic (PLEG): container finished" podID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerID="4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752961 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-779774b9bd-gltc4" event={"ID":"4f6817f6-ce07-4b0f-a6c8-d667faa003a8","Type":"ContainerDied","Data":"4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.752989 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-779774b9bd-gltc4" event={"ID":"4f6817f6-ce07-4b0f-a6c8-d667faa003a8","Type":"ContainerDied","Data":"57bbfc8deb83169bbcacb2dfd5aaf7443e666a2c68daaad333137eb7c9ecc184"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.753405 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-logs" (OuterVolumeSpecName: "logs") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.753412 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bf07-account-create-update-97k2v"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.753444 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.754991 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.759635 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/743192ea-7611-42dc-ab35-0dd584590479-scripts" (OuterVolumeSpecName: "scripts") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.760519 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-779774b9bd-gltc4" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.762136 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-kube-api-access-bpvht" (OuterVolumeSpecName: "kube-api-access-bpvht") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "kube-api-access-bpvht". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.765453 5200 projected.go:194] Error preparing data for projected volume kube-api-access-99b4k for pod openstack/keystonebf07-account-delete-4m8mk: failed to fetch token: serviceaccounts "galera-openstack" not found Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.765514 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k podName:ab55370d-6063-4303-b300-104b73319e88 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:09.265496296 +0000 UTC m=+1477.944698114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-99b4k" (UniqueName: "kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k") pod "keystonebf07-account-delete-4m8mk" (UID: "ab55370d-6063-4303-b300-104b73319e88") : failed to fetch token: serviceaccounts "galera-openstack" not found Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.769329 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-c2h6h"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.773484 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-scripts" (OuterVolumeSpecName: "scripts") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.820243 5200 scope.go:117] "RemoveContainer" containerID="f5e60a029fe6708251acff52c633b481ec59ead862a791d6d3ca76ff77b39620" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.820492 5200 generic.go:358] "Generic (PLEG): container finished" podID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerID="03b29c837a967963b6848307dc062dac95cb5e7b79937c757a26948129ebd5ec" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.820627 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"046ac75d-4cfe-40ed-aca5-a5875ea641d8","Type":"ContainerDied","Data":"03b29c837a967963b6848307dc062dac95cb5e7b79937c757a26948129ebd5ec"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.820653 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"046ac75d-4cfe-40ed-aca5-a5875ea641d8","Type":"ContainerDied","Data":"3d48004db4d106cd8c7135bcd815a52ade65c2d73e608a8d23d85826804fd9de"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.820665 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d48004db4d106cd8c7135bcd815a52ade65c2d73e608a8d23d85826804fd9de" Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.824450 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-99b4k operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystonebf07-account-delete-4m8mk" podUID="ab55370d-6063-4303-b300-104b73319e88" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.824770 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743192ea-7611-42dc-ab35-0dd584590479-kube-api-access-btmcn" (OuterVolumeSpecName: "kube-api-access-btmcn") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "kube-api-access-btmcn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.824815 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-c2h6h"] Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.859466 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data-custom\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.859730 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2e56e0-4f2e-4101-b051-111cb2cbee56-logs\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.859771 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92np6\" (UniqueName: \"kubernetes.io/projected/ff2e56e0-4f2e-4101-b051-111cb2cbee56-kube-api-access-92np6\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.859813 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-public-tls-certs\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.859859 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.860054 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-scripts\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.860098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-combined-ca-bundle\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.860132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff2e56e0-4f2e-4101-b051-111cb2cbee56-etc-machine-id\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.860165 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-internal-tls-certs\") pod \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\" (UID: \"ff2e56e0-4f2e-4101-b051-111cb2cbee56\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861006 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861019 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/743192ea-7611-42dc-ab35-0dd584590479-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861033 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861041 5200 reconciler_common.go:299] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861050 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-btmcn\" (UniqueName: \"kubernetes.io/projected/743192ea-7611-42dc-ab35-0dd584590479-kube-api-access-btmcn\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861058 5200 reconciler_common.go:299] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861070 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpvht\" (UniqueName: \"kubernetes.io/projected/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-kube-api-access-bpvht\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861080 5200 reconciler_common.go:299] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/743192ea-7611-42dc-ab35-0dd584590479-var-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.861983 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff2e56e0-4f2e-4101-b051-111cb2cbee56-logs" (OuterVolumeSpecName: "logs") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.866510 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff2e56e0-4f2e-4101-b051-111cb2cbee56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.868231 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.871406 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.872653 5200 generic.go:358] "Generic (PLEG): container finished" podID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerID="b9326640d2dd0dc2cb465cceabfeb8666c56ac7ec888f7a972c9d1baf978168e" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.872827 5200 generic.go:358] "Generic (PLEG): container finished" podID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerID="c12f4bdddf37a1ef5e860426fe948bdb51ce3f0edf70735ec8c9ab110cb59d7e" exitCode=2 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.873033 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerDied","Data":"b9326640d2dd0dc2cb465cceabfeb8666c56ac7ec888f7a972c9d1baf978168e"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.873069 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerDied","Data":"c12f4bdddf37a1ef5e860426fe948bdb51ce3f0edf70735ec8c9ab110cb59d7e"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.878826 5200 generic.go:358] "Generic (PLEG): container finished" podID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" containerID="956d596d413d6bc508604849be962ec96f688d90bc9bd6e3f6b5953f97e5186e" exitCode=2 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.878904 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb","Type":"ContainerDied","Data":"956d596d413d6bc508604849be962ec96f688d90bc9bd6e3f6b5953f97e5186e"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.880519 5200 generic.go:358] "Generic (PLEG): container finished" podID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerID="994d0d2f29569bced7df7f345325a897433ec18de7893657a1375fbcb53bacf2" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.880719 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8484d8df-fa24-4e8e-85ba-c318fc6cde61","Type":"ContainerDied","Data":"994d0d2f29569bced7df7f345325a897433ec18de7893657a1375fbcb53bacf2"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.882447 5200 generic.go:358] "Generic (PLEG): container finished" podID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerID="f8133fd16248ab3367c5964464b825f0d076ca73089fbe5d382745f4baeb4dd1" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.882645 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4bdf37de-cbcd-4971-a688-f3b91cf98b47","Type":"ContainerDied","Data":"f8133fd16248ab3367c5964464b825f0d076ca73089fbe5d382745f4baeb4dd1"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.884068 5200 generic.go:358] "Generic (PLEG): container finished" podID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerID="c58ee236e1d2b49c25a04da65637b68cbc07d2541bfa6191ac3c1b680febe5ff" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.884186 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" event={"ID":"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59","Type":"ContainerDied","Data":"c58ee236e1d2b49c25a04da65637b68cbc07d2541bfa6191ac3c1b680febe5ff"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.884969 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-scripts" (OuterVolumeSpecName: "scripts") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.890337 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2e56e0-4f2e-4101-b051-111cb2cbee56-kube-api-access-92np6" (OuterVolumeSpecName: "kube-api-access-92np6") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "kube-api-access-92np6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.897251 5200 generic.go:358] "Generic (PLEG): container finished" podID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerID="6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.897393 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ff2e56e0-4f2e-4101-b051-111cb2cbee56","Type":"ContainerDied","Data":"6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.897430 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ff2e56e0-4f2e-4101-b051-111cb2cbee56","Type":"ContainerDied","Data":"f54de4cd84d2e6f4c6259d86e5e2a578b1d331db0f82e096ae21b7207df0c712"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.897548 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.899722 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e30c-account-delete-pnqrp" event={"ID":"6d529441-c1c2-4f95-bc63-ce522b9f1b07","Type":"ContainerStarted","Data":"4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.900041 5200 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell0e30c-account-delete-pnqrp" secret="" err="secret \"galera-openstack-dockercfg-jckgj\" not found" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.903563 5200 generic.go:358] "Generic (PLEG): container finished" podID="caee6c82-a418-4a1d-a364-0a0eb9073c09" containerID="f2897af7d677ef9e566180c09815773b0b5d2d1175b9ce424e5ca281bca9c086" exitCode=0 Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.903615 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutronca07-account-delete-xb4zf" event={"ID":"caee6c82-a418-4a1d-a364-0a0eb9073c09","Type":"ContainerDied","Data":"f2897af7d677ef9e566180c09815773b0b5d2d1175b9ce424e5ca281bca9c086"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.905151 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5c5d-account-delete-n94tm" event={"ID":"225d875d-9c0b-4acd-9305-b5cd82ea9056","Type":"ContainerStarted","Data":"7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718"} Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.905295 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kp5s8" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.905410 5200 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi5c5d-account-delete-n94tm" secret="" err="secret \"galera-openstack-dockercfg-jckgj\" not found" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.905988 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.906218 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.906384 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.961966 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-logs\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.962372 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-scripts\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.962474 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-combined-ca-bundle\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.962617 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-internal-tls-certs\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.962844 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-config-data\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.970904 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhl87\" (UniqueName: \"kubernetes.io/projected/046ac75d-4cfe-40ed-aca5-a5875ea641d8-kube-api-access-jhl87\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.971467 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.971622 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-httpd-run\") pod \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\" (UID: \"046ac75d-4cfe-40ed-aca5-a5875ea641d8\") " Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.973135 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-logs" (OuterVolumeSpecName: "logs") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.985459 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2e56e0-4f2e-4101-b051-111cb2cbee56-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.985491 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92np6\" (UniqueName: \"kubernetes.io/projected/ff2e56e0-4f2e-4101-b051-111cb2cbee56-kube-api-access-92np6\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.985508 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.985519 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff2e56e0-4f2e-4101-b051-111cb2cbee56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.985532 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.985545 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.985608 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.985674 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:09.485655489 +0000 UTC m=+1478.164857307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.986006 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:08 crc kubenswrapper[5200]: E1126 13:55:08.986043 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts podName:6d529441-c1c2-4f95-bc63-ce522b9f1b07 nodeName:}" failed. 
No retries permitted until 2025-11-26 13:55:09.486032979 +0000 UTC m=+1478.165234797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts") pod "novacell0e30c-account-delete-pnqrp" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07") : configmap "openstack-scripts" not found Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.995829 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:08 crc kubenswrapper[5200]: I1126 13:55:08.996124 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-scripts" (OuterVolumeSpecName: "scripts") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.056563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046ac75d-4cfe-40ed-aca5-a5875ea641d8-kube-api-access-jhl87" (OuterVolumeSpecName: "kube-api-access-jhl87") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "kube-api-access-jhl87". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.056610 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="27a62b3a-c69a-47fa-9642-425931d81827" containerName="galera" containerID="cri-o://9d12f78cd05b747844b762871d784048deb6a46596cfe397348c7103574c5d30" gracePeriod=30 Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.075216 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efdd422-f937-4662-a6f4-3b0e6b0232dd" path="/var/lib/kubelet/pods/1efdd422-f937-4662-a6f4-3b0e6b0232dd/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.077930 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7173d1b4-2e90-4382-865d-4736322f38fa" path="/var/lib/kubelet/pods/7173d1b4-2e90-4382-865d-4736322f38fa/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.078619 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6b8c32-f5d6-4f06-932a-81c158db7a8c" path="/var/lib/kubelet/pods/8e6b8c32-f5d6-4f06-932a-81c158db7a8c/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.079348 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0772c27-b753-402f-aded-0d4050bc80bd" path="/var/lib/kubelet/pods/b0772c27-b753-402f-aded-0d4050bc80bd/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.080847 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c642182e-867b-4579-9289-0347c94a1e0a" path="/var/lib/kubelet/pods/c642182e-867b-4579-9289-0347c94a1e0a/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.081378 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9efb7a-f26f-4635-9b6c-95a008eb05e0" path="/var/lib/kubelet/pods/cb9efb7a-f26f-4635-9b6c-95a008eb05e0/volumes" Nov 26 
13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.081952 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e804f8-39b6-4544-8117-3bb9c337973b" path="/var/lib/kubelet/pods/d9e804f8-39b6-4544-8117-3bb9c337973b/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.082887 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db49787b-26a2-4daf-a2fc-040890054555" path="/var/lib/kubelet/pods/db49787b-26a2-4daf-a2fc-040890054555/volumes" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.088160 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhl87\" (UniqueName: \"kubernetes.io/projected/046ac75d-4cfe-40ed-aca5-a5875ea641d8-kube-api-access-jhl87\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.088202 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/046ac75d-4cfe-40ed-aca5-a5875ea641d8-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.088216 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.097183 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": read tcp 10.217.0.2:55138->10.217.0.198:8774: read: connection reset by peer" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.097228 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": read tcp 10.217.0.2:55152->10.217.0.198:8774: read: connection reset by peer" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.099655 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.100119 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0e30c-account-delete-pnqrp" podStartSLOduration=6.100102389 podStartE2EDuration="6.100102389s" podCreationTimestamp="2025-11-26 13:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:55:08.996273714 +0000 UTC m=+1477.675475552" watchObservedRunningTime="2025-11-26 13:55:09.100102389 +0000 UTC m=+1477.779304207" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.107345 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi5c5d-account-delete-n94tm" podStartSLOduration=6.107331017 podStartE2EDuration="6.107331017s" podCreationTimestamp="2025-11-26 13:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 13:55:09.036461937 +0000 UTC m=+1477.715663755" watchObservedRunningTime="2025-11-26 13:55:09.107331017 +0000 UTC m=+1477.786532835" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.116562 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-config-data" (OuterVolumeSpecName: "config-data") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.190115 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.190165 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.195368 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.214387 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.221443 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.237747 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.293659 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99b4k\" (UniqueName: \"kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.293728 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.293929 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.293942 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.293951 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.293960 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.294024 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.294081 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts podName:ab55370d-6063-4303-b300-104b73319e88 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.294064573 +0000 UTC m=+1478.973266391 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts") pod "keystonebf07-account-delete-4m8mk" (UID: "ab55370d-6063-4303-b300-104b73319e88") : configmap "openstack-scripts" not found Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.302661 5200 projected.go:194] Error preparing data for projected volume kube-api-access-99b4k for pod openstack/keystonebf07-account-delete-4m8mk: failed to fetch token: serviceaccounts "galera-openstack" not found Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.302761 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k podName:ab55370d-6063-4303-b300-104b73319e88 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.302740148 +0000 UTC m=+1478.981941966 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-99b4k" (UniqueName: "kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k") pod "keystonebf07-account-delete-4m8mk" (UID: "ab55370d-6063-4303-b300-104b73319e88") : failed to fetch token: serviceaccounts "galera-openstack" not found Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.308898 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data" (OuterVolumeSpecName: "config-data") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.309048 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "743192ea-7611-42dc-ab35-0dd584590479" (UID: "743192ea-7611-42dc-ab35-0dd584590479"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.310881 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.324383 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.330817 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-config-data" (OuterVolumeSpecName: "config-data") pod "046ac75d-4cfe-40ed-aca5-a5875ea641d8" (UID: "046ac75d-4cfe-40ed-aca5-a5875ea641d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.351081 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff2e56e0-4f2e-4101-b051-111cb2cbee56" (UID: "ff2e56e0-4f2e-4101-b051-111cb2cbee56"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.351753 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.368653 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.371609 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f6817f6-ce07-4b0f-a6c8-d667faa003a8" (UID: "4f6817f6-ce07-4b0f-a6c8-d667faa003a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395837 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/743192ea-7611-42dc-ab35-0dd584590479-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395879 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395892 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395903 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395941 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/046ac75d-4cfe-40ed-aca5-a5875ea641d8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395950 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395958 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e56e0-4f2e-4101-b051-111cb2cbee56-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395967 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6817f6-ce07-4b0f-a6c8-d667faa003a8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.395975 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.429736 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/novaapi5c5d-account-delete-n94tm"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.429816 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5c5d-account-create-update-r2hfq"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.429832 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5c5d-account-create-update-r2hfq"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.452525 5200 scope.go:117] "RemoveContainer" containerID="ad8e5c8ff258934df8b031e4ca400c00da84dc8e84950692fb247eb689629ffd" Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.500267 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.500702 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts 
podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.500671445 +0000 UTC m=+1479.179873263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.500401 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:09 crc kubenswrapper[5200]: E1126 13:55:09.500737 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts podName:6d529441-c1c2-4f95-bc63-ce522b9f1b07 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:10.500731226 +0000 UTC m=+1479.179933044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts") pod "novacell0e30c-account-delete-pnqrp" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07") : configmap "openstack-scripts" not found Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.528988 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.540672 5200 scope.go:117] "RemoveContainer" containerID="3a16c7394c0c1795c169ec71ab7e6a4bb501e39538c28481c5e4c71f4c4d71af" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.600922 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.601590 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.601292 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-httpd-run\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.602824 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-public-tls-certs\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.603024 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.603855 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-combined-ca-bundle\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.604276 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-logs\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.604421 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj949\" (UniqueName: \"kubernetes.io/projected/4bdf37de-cbcd-4971-a688-f3b91cf98b47-kube-api-access-hj949\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.604715 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts\") pod \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\" (UID: \"4bdf37de-cbcd-4971-a688-f3b91cf98b47\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.605613 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-logs" (OuterVolumeSpecName: "logs") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.606674 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.606901 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf37de-cbcd-4971-a688-f3b91cf98b47-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.612089 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.612203 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts" (OuterVolumeSpecName: "scripts") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.623889 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdf37de-cbcd-4971-a688-f3b91cf98b47-kube-api-access-hj949" (OuterVolumeSpecName: "kube-api-access-hj949") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "kube-api-access-hj949". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.661129 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.678958 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.681464 5200 scope.go:117] "RemoveContainer" containerID="819c90649cfd578627699cf1864ff703ef1f85ca28a43cafd1731fbe6e832be9" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.682497 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.682755 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.706722 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.712457 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.713759 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.713772 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hj949\" (UniqueName: \"kubernetes.io/projected/4bdf37de-cbcd-4971-a688-f3b91cf98b47-kube-api-access-hj949\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.713781 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.713802 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.721324 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.727591 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.734669 5200 scope.go:117] "RemoveContainer" containerID="bf61bc7bfe41e1213dbebfb173eaf919afb90cb614da617b3eb37e70e0d8b683" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.755994 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.759673 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.763701 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.787093 5200 scope.go:117] "RemoveContainer" containerID="4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.788811 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.805146 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.810624 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.810821 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data" (OuterVolumeSpecName: "config-data") pod "4bdf37de-cbcd-4971-a688-f3b91cf98b47" (UID: "4bdf37de-cbcd-4971-a688-f3b91cf98b47"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815236 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-config-data\") pod \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815275 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-config\") pod \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815338 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-combined-ca-bundle\") pod \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815369 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjxt5\" (UniqueName: \"kubernetes.io/projected/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-kube-api-access-vjxt5\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815395 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-combined-ca-bundle\") pod \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815487 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8484d8df-fa24-4e8e-85ba-c318fc6cde61-logs\") pod \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815511 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4txq4\" (UniqueName: \"kubernetes.io/projected/8484d8df-fa24-4e8e-85ba-c318fc6cde61-kube-api-access-4txq4\") pod \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815574 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-public-tls-certs\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815606 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-combined-ca-bundle\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815632 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data-custom\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815709 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-internal-tls-certs\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815739 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-certs\") pod \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815767 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815814 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-logs\") pod \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\" (UID: \"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815857 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-nova-metadata-tls-certs\") pod \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\" (UID: \"8484d8df-fa24-4e8e-85ba-c318fc6cde61\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.815875 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rxh4\" (UniqueName: \"kubernetes.io/projected/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-api-access-2rxh4\") pod \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\" (UID: \"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.816301 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.816320 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.816330 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf37de-cbcd-4971-a688-f3b91cf98b47-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.817934 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-logs" (OuterVolumeSpecName: "logs") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.822010 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8484d8df-fa24-4e8e-85ba-c318fc6cde61-logs" (OuterVolumeSpecName: "logs") pod "8484d8df-fa24-4e8e-85ba-c318fc6cde61" (UID: "8484d8df-fa24-4e8e-85ba-c318fc6cde61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.841350 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8484d8df-fa24-4e8e-85ba-c318fc6cde61-kube-api-access-4txq4" (OuterVolumeSpecName: "kube-api-access-4txq4") pod "8484d8df-fa24-4e8e-85ba-c318fc6cde61" (UID: "8484d8df-fa24-4e8e-85ba-c318fc6cde61"). InnerVolumeSpecName "kube-api-access-4txq4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.841399 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.844430 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-api-access-2rxh4" (OuterVolumeSpecName: "kube-api-access-2rxh4") pod "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" (UID: "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb"). InnerVolumeSpecName "kube-api-access-2rxh4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.858894 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-kube-api-access-vjxt5" (OuterVolumeSpecName: "kube-api-access-vjxt5") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "kube-api-access-vjxt5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.874952 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.876418 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" (UID: "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.883210 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" (UID: "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.885189 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8484d8df-fa24-4e8e-85ba-c318fc6cde61" (UID: "8484d8df-fa24-4e8e-85ba-c318fc6cde61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.886087 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-config-data" (OuterVolumeSpecName: "config-data") pod "8484d8df-fa24-4e8e-85ba-c318fc6cde61" (UID: "8484d8df-fa24-4e8e-85ba-c318fc6cde61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.899097 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kp5s8"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.905336 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data" (OuterVolumeSpecName: "config-data") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.908860 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.922837 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kp5s8"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.924141 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb7xk\" (UniqueName: \"kubernetes.io/projected/ab255459-89c5-462e-a89d-eb26ba377f79-kube-api-access-hb7xk\") pod \"ab255459-89c5-462e-a89d-eb26ba377f79\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.924313 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05534089-21c2-4d8c-9b9e-d21c7082e727-operator-scripts\") pod \"05534089-21c2-4d8c-9b9e-d21c7082e727\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.924469 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsz2q\" (UniqueName: \"kubernetes.io/projected/05534089-21c2-4d8c-9b9e-d21c7082e727-kube-api-access-bsz2q\") pod \"05534089-21c2-4d8c-9b9e-d21c7082e727\" (UID: \"05534089-21c2-4d8c-9b9e-d21c7082e727\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.924621 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab255459-89c5-462e-a89d-eb26ba377f79-operator-scripts\") pod \"ab255459-89c5-462e-a89d-eb26ba377f79\" (UID: \"ab255459-89c5-462e-a89d-eb26ba377f79\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.924858 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-combined-ca-bundle\") pod \"55e57b83-77f4-446e-bc0a-c78f9094aacb\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.925056 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-kolla-config\") pod \"55e57b83-77f4-446e-bc0a-c78f9094aacb\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.925181 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-memcached-tls-certs\") pod \"55e57b83-77f4-446e-bc0a-c78f9094aacb\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.925291 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-config-data\") pod \"55e57b83-77f4-446e-bc0a-c78f9094aacb\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.925483 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvhqs\" (UniqueName: \"kubernetes.io/projected/55e57b83-77f4-446e-bc0a-c78f9094aacb-kube-api-access-jvhqs\") pod \"55e57b83-77f4-446e-bc0a-c78f9094aacb\" (UID: \"55e57b83-77f4-446e-bc0a-c78f9094aacb\") " Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926137 5200 reconciler_common.go:299] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8484d8df-fa24-4e8e-85ba-c318fc6cde61-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926246 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4txq4\" (UniqueName: \"kubernetes.io/projected/8484d8df-fa24-4e8e-85ba-c318fc6cde61-kube-api-access-4txq4\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926341 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926420 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926641 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926748 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926833 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rxh4\" (UniqueName: \"kubernetes.io/projected/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-api-access-2rxh4\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926909 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.926993 5200 reconciler_common.go:299] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.927067 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.927144 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vjxt5\" (UniqueName: \"kubernetes.io/projected/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-kube-api-access-vjxt5\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.927220 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.929108 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05534089-21c2-4d8c-9b9e-d21c7082e727-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05534089-21c2-4d8c-9b9e-d21c7082e727" (UID: "05534089-21c2-4d8c-9b9e-d21c7082e727"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.933731 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.934481 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.935078 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab255459-89c5-462e-a89d-eb26ba377f79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab255459-89c5-462e-a89d-eb26ba377f79" (UID: "ab255459-89c5-462e-a89d-eb26ba377f79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.935573 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-config-data" (OuterVolumeSpecName: "config-data") pod "55e57b83-77f4-446e-bc0a-c78f9094aacb" (UID: "55e57b83-77f4-446e-bc0a-c78f9094aacb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.935620 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "55e57b83-77f4-446e-bc0a-c78f9094aacb" (UID: "55e57b83-77f4-446e-bc0a-c78f9094aacb"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.938841 5200 generic.go:358] "Generic (PLEG): container finished" podID="55e57b83-77f4-446e-bc0a-c78f9094aacb" containerID="4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2" exitCode=0 Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.938947 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"55e57b83-77f4-446e-bc0a-c78f9094aacb","Type":"ContainerDied","Data":"4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2"} Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.939948 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"55e57b83-77f4-446e-bc0a-c78f9094aacb","Type":"ContainerDied","Data":"bc3f8dee613dcc94578135d14352848e76bf8c58cc711537db2bb71c55a99bc0"} Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.938966 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.946628 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05534089-21c2-4d8c-9b9e-d21c7082e727-kube-api-access-bsz2q" (OuterVolumeSpecName: "kube-api-access-bsz2q") pod "05534089-21c2-4d8c-9b9e-d21c7082e727" (UID: "05534089-21c2-4d8c-9b9e-d21c7082e727"). InnerVolumeSpecName "kube-api-access-bsz2q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.951438 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8484d8df-fa24-4e8e-85ba-c318fc6cde61" (UID: "8484d8df-fa24-4e8e-85ba-c318fc6cde61"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.952269 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab255459-89c5-462e-a89d-eb26ba377f79-kube-api-access-hb7xk" (OuterVolumeSpecName: "kube-api-access-hb7xk") pod "ab255459-89c5-462e-a89d-eb26ba377f79" (UID: "ab255459-89c5-462e-a89d-eb26ba377f79"). InnerVolumeSpecName "kube-api-access-hb7xk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.961416 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e57b83-77f4-446e-bc0a-c78f9094aacb-kube-api-access-jvhqs" (OuterVolumeSpecName: "kube-api-access-jvhqs") pod "55e57b83-77f4-446e-bc0a-c78f9094aacb" (UID: "55e57b83-77f4-446e-bc0a-c78f9094aacb"). InnerVolumeSpecName "kube-api-access-jvhqs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.967373 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" (UID: "eab77ac1-a779-46ae-b0f6-7d39a99d1ecb"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.969232 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.970795 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placemente6ea-account-delete-s2v6r" event={"ID":"05534089-21c2-4d8c-9b9e-d21c7082e727","Type":"ContainerDied","Data":"872e7fcf8cb0537db11950616c119948931b0102090d8fb123aa10bb5b349a44"} Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.970827 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872e7fcf8cb0537db11950616c119948931b0102090d8fb123aa10bb5b349a44" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.971143 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placemente6ea-account-delete-s2v6r" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.981927 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55e57b83-77f4-446e-bc0a-c78f9094aacb" (UID: "55e57b83-77f4-446e-bc0a-c78f9094aacb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.986322 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" (UID: "789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.993644 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderf459-account-delete-h8qgw" Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.994167 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinderf459-account-delete-h8qgw" event={"ID":"ab255459-89c5-462e-a89d-eb26ba377f79","Type":"ContainerDied","Data":"0569a45037dddceea1861312750535ac41f06661a246674622d908d6cd002217"} Nov 26 13:55:09 crc kubenswrapper[5200]: I1126 13:55:09.994217 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0569a45037dddceea1861312750535ac41f06661a246674622d908d6cd002217" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.001529 5200 generic.go:358] "Generic (PLEG): container finished" podID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerID="b908cb3b6ebf4aeb47d606fa2e54c25261e51e4dbd764b3dd457699d4b48474a" exitCode=0 Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.001622 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerDied","Data":"b908cb3b6ebf4aeb47d606fa2e54c25261e51e4dbd764b3dd457699d4b48474a"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.005998 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-779774b9bd-gltc4"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.007922 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance4584-account-delete-7mnx9" event={"ID":"8db404a7-2c7c-4a0c-b21c-570bb473b1ff","Type":"ContainerDied","Data":"6777c983b2370ab4787dc430fd28bf0f7626c87a80410448c20c194b39a9e2e8"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.010077 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6777c983b2370ab4787dc430fd28bf0f7626c87a80410448c20c194b39a9e2e8" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.011276 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eab77ac1-a779-46ae-b0f6-7d39a99d1ecb","Type":"ContainerDied","Data":"c3b964ab1121c55e197e9b9c929c6fd48c8f726021af4a3f167064718031b820"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.011514 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.021025 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8484d8df-fa24-4e8e-85ba-c318fc6cde61","Type":"ContainerDied","Data":"138a46eb895f388cbd9ac7e54d166edebac46d5e12c56217fc148160c245f104"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.021123 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.028982 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0a47-account-delete-92fz8" event={"ID":"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f","Type":"ContainerDied","Data":"eec4a7b4976510279b79d832079909fbc0a15d5c25205c02a5ce70e9d3b72d24"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.029020 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eec4a7b4976510279b79d832079909fbc0a15d5c25205c02a5ce70e9d3b72d24" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.029951 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvhqs\" (UniqueName: \"kubernetes.io/projected/55e57b83-77f4-446e-bc0a-c78f9094aacb-kube-api-access-jvhqs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.029966 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.029976 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hb7xk\" (UniqueName: \"kubernetes.io/projected/ab255459-89c5-462e-a89d-eb26ba377f79-kube-api-access-hb7xk\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.029984 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05534089-21c2-4d8c-9b9e-d21c7082e727-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.029992 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030001 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsz2q\" (UniqueName: \"kubernetes.io/projected/05534089-21c2-4d8c-9b9e-d21c7082e727-kube-api-access-bsz2q\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030010 5200 reconciler_common.go:299] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030019 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab255459-89c5-462e-a89d-eb26ba377f79-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030028 5200 reconciler_common.go:299] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8484d8df-fa24-4e8e-85ba-c318fc6cde61-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030036 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030044 5200 reconciler_common.go:299] "Volume detached for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.030053 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/55e57b83-77f4-446e-bc0a-c78f9094aacb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.031788 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4bdf37de-cbcd-4971-a688-f3b91cf98b47","Type":"ContainerDied","Data":"ee433c892a72ac2b116b848168af197a5ddf19a92a364f003343a68e3b2aab04"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.031872 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.036886 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-779774b9bd-gltc4"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.042125 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "55e57b83-77f4-446e-bc0a-c78f9094aacb" (UID: "55e57b83-77f4-446e-bc0a-c78f9094aacb"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.069297 5200 generic.go:358] "Generic (PLEG): container finished" podID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerID="78be597c70fe60222ad7f16fc5b17c8bb5fb9f375d97a8edf8fce01c8f2a8af4" exitCode=0 Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.069609 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0","Type":"ContainerDied","Data":"78be597c70fe60222ad7f16fc5b17c8bb5fb9f375d97a8edf8fce01c8f2a8af4"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.069663 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0","Type":"ContainerDied","Data":"d7feaa0e0a943cd5ff1049b449daef3bbd78f5a0c53dcbd0016828ef49824f9b"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.069723 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7feaa0e0a943cd5ff1049b449daef3bbd78f5a0c53dcbd0016828ef49824f9b" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.074449 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.074452 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" event={"ID":"789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59","Type":"ContainerDied","Data":"80e3e11a46811afe460b1bc7a1b17a7c419c8b0f99bd99bac53fc0d0212ab3ce"} Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.074468 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.074455 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.075986 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/novacell0e30c-account-delete-pnqrp" podUID="6d529441-c1c2-4f95-bc63-ce522b9f1b07" containerName="mariadb-account-delete" containerID="cri-o://4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599" gracePeriod=30 Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.111117 5200 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi5c5d-account-delete-n94tm" secret="" err="secret \"galera-openstack-dockercfg-jckgj\" not found" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.139400 5200 reconciler_common.go:299] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/55e57b83-77f4-446e-bc0a-c78f9094aacb-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.143321 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.165665 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.199234 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.206740 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.210241 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.216152 5200 scope.go:117] "RemoveContainer" containerID="01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.226885 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.237990 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.241078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-operator-scripts\") pod \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.241146 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj2g5\" (UniqueName: \"kubernetes.io/projected/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-kube-api-access-lj2g5\") pod \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\" (UID: \"8fdc5a98-ccb9-4059-ba29-729ad8af9c4f\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.241730 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" (UID: "8fdc5a98-ccb9-4059-ba29-729ad8af9c4f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.241906 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.241968 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.242126 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data podName:5f5ea8d4-ba2b-4abe-b645-0513268421f9 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:18.242106337 +0000 UTC m=+1486.921308155 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data") pod "rabbitmq-server-0" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9") : configmap "rabbitmq-config-data" not found Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.248770 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-kube-api-access-lj2g5" (OuterVolumeSpecName: "kube-api-access-lj2g5") pod "8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" (UID: "8fdc5a98-ccb9-4059-ba29-729ad8af9c4f"). InnerVolumeSpecName "kube-api-access-lj2g5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.253276 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.263346 5200 scope.go:117] "RemoveContainer" containerID="4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.268495 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce\": container with ID starting with 4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce not found: ID does not exist" containerID="4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.268533 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce"} err="failed to get container status \"4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce\": rpc error: code = NotFound desc = could not find container \"4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce\": container with ID starting with 4a287dce5beabba9c434adaae4a258c28c31cfff2e53ca0e95ca7c6cc4b0bdce not found: ID does not exist" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.268557 5200 scope.go:117] "RemoveContainer" containerID="01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.272187 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e\": container with ID starting with 01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e not found: ID does 
not exist" containerID="01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.272224 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e"} err="failed to get container status \"01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e\": rpc error: code = NotFound desc = could not find container \"01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e\": container with ID starting with 01c75fb36995a012edd4bac37428077853fb4a126d0d338c6b724b7a2f47732e not found: ID does not exist" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.272242 5200 scope.go:117] "RemoveContainer" containerID="6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.276362 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.280984 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.281238 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.285955 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.286016 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="adab4e53-c668-44cf-be5d-5802a77124aa" containerName="nova-scheduler-scheduler" probeResult="unknown" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.301100 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.308309 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d9fcdcd8b-tl47q"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.311831 5200 scope.go:117] "RemoveContainer" containerID="7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.328831 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d9fcdcd8b-tl47q"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.341342 5200 scope.go:117] "RemoveContainer" containerID="6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.342156 5200 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb\": container with ID starting with 6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb not found: ID does not exist" containerID="6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.342192 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb"} err="failed to get container status \"6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb\": rpc error: code = NotFound desc = could not find container \"6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb\": container with ID starting with 6c2ad61e07480e63c8e5b79910c7014834e6ab0562f5ae4cda5000442fab40eb not found: ID does not exist" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.342214 5200 scope.go:117] "RemoveContainer" containerID="7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.342766 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-operator-scripts\") pod \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.342895 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-logs\") pod \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.343981 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-logs" (OuterVolumeSpecName: "logs") pod "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" (UID: "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344434 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8db404a7-2c7c-4a0c-b21c-570bb473b1ff" (UID: "8db404a7-2c7c-4a0c-b21c-570bb473b1ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344492 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-internal-tls-certs\") pod \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344585 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-combined-ca-bundle\") pod \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.344584 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066\": container with ID starting with 7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066 not found: ID does not exist" containerID="7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344627 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066"} err="failed to get container status \"7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066\": rpc error: code = NotFound desc = could not find container \"7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066\": container with ID starting with 7bcbedf80093a79899896a6b62b1236b5c501d1039cdad8c2a8d0d1f29790066 not found: ID does not exist" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344660 5200 scope.go:117] "RemoveContainer" containerID="4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344667 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-public-tls-certs\") pod \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344751 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-config-data\") pod \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344781 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kwxc\" (UniqueName: \"kubernetes.io/projected/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-kube-api-access-2kwxc\") pod \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\" (UID: \"d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.344805 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqbh\" (UniqueName: \"kubernetes.io/projected/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-kube-api-access-xxqbh\") pod \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\" (UID: \"8db404a7-2c7c-4a0c-b21c-570bb473b1ff\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.345128 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-99b4k\" (UniqueName: \"kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.345166 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts\") pod \"keystonebf07-account-delete-4m8mk\" (UID: \"ab55370d-6063-4303-b300-104b73319e88\") " pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.345253 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lj2g5\" (UniqueName: \"kubernetes.io/projected/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f-kube-api-access-lj2g5\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.345264 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.345272 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.351891 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.351979 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts podName:ab55370d-6063-4303-b300-104b73319e88 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:12.351959318 +0000 UTC m=+1481.031161136 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts") pod "keystonebf07-account-delete-4m8mk" (UID: "ab55370d-6063-4303-b300-104b73319e88") : configmap "openstack-scripts" not found Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.355810 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-kube-api-access-2kwxc" (OuterVolumeSpecName: "kube-api-access-2kwxc") pod "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" (UID: "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0"). InnerVolumeSpecName "kube-api-access-2kwxc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.357334 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-kube-api-access-xxqbh" (OuterVolumeSpecName: "kube-api-access-xxqbh") pod "8db404a7-2c7c-4a0c-b21c-570bb473b1ff" (UID: "8db404a7-2c7c-4a0c-b21c-570bb473b1ff"). InnerVolumeSpecName "kube-api-access-xxqbh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.358361 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.359100 5200 projected.go:194] Error preparing data for projected volume kube-api-access-99b4k for pod openstack/keystonebf07-account-delete-4m8mk: failed to fetch token: serviceaccounts "galera-openstack" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.359196 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k podName:ab55370d-6063-4303-b300-104b73319e88 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:12.359173915 +0000 UTC m=+1481.038375733 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-99b4k" (UniqueName: "kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k") pod "keystonebf07-account-delete-4m8mk" (UID: "ab55370d-6063-4303-b300-104b73319e88") : failed to fetch token: serviceaccounts "galera-openstack" not found Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.377644 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-config-data" (OuterVolumeSpecName: "config-data") pod "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" (UID: "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.378735 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.388622 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.389596 5200 scope.go:117] "RemoveContainer" containerID="4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.390986 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2\": container with ID starting with 4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2 not found: ID does not exist" containerID="4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.391018 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2"} err="failed to get container status \"4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2\": rpc error: code = NotFound desc = could not find container \"4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2\": container with ID starting with 4fa2b7b70262939b84e3cf1dbe9df22aedaf5c1b33415647d2a4fb95c343cdc2 not found: ID does not exist" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.391041 5200 scope.go:117] "RemoveContainer" containerID="956d596d413d6bc508604849be962ec96f688d90bc9bd6e3f6b5953f97e5186e" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.396268 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.400525 5200 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" (UID: "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.402895 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" (UID: "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.441130 5200 scope.go:117] "RemoveContainer" containerID="994d0d2f29569bced7df7f345325a897433ec18de7893657a1375fbcb53bacf2" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.450208 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.450239 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.450251 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.450263 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kwxc\" (UniqueName: \"kubernetes.io/projected/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-kube-api-access-2kwxc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.450276 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxqbh\" (UniqueName: \"kubernetes.io/projected/8db404a7-2c7c-4a0c-b21c-570bb473b1ff-kube-api-access-xxqbh\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.451891 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" (UID: "d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.479128 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.488674 5200 scope.go:117] "RemoveContainer" containerID="6403883701f7e5dc62ff1b903104a5ab99aba9e00d1492955efcacde021d3e12" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.519345 5200 scope.go:117] "RemoveContainer" containerID="f8133fd16248ab3367c5964464b825f0d076ca73089fbe5d382745f4baeb4dd1" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.549177 5200 scope.go:117] "RemoveContainer" containerID="f77ef15f63cb09956b5bfe2dd8b835072429e31cb25ce434a16fdc1b92dc9dea" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.551370 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.551394 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.551438 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.551494 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:12.551462765 +0000 UTC m=+1481.230664593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.551527 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts podName:6d529441-c1c2-4f95-bc63-ce522b9f1b07 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:12.551520257 +0000 UTC m=+1481.230722075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts") pod "novacell0e30c-account-delete-pnqrp" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07") : configmap "openstack-scripts" not found Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.579523 5200 scope.go:117] "RemoveContainer" containerID="c58ee236e1d2b49c25a04da65637b68cbc07d2541bfa6191ac3c1b680febe5ff" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.612716 5200 scope.go:117] "RemoveContainer" containerID="a4649f26c430accc7c822a3b6948eaa51389ccaecd8fe7409b8405cd4514eba9" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.652736 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caee6c82-a418-4a1d-a364-0a0eb9073c09-operator-scripts\") pod \"caee6c82-a418-4a1d-a364-0a0eb9073c09\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.652809 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4x8\" (UniqueName: \"kubernetes.io/projected/caee6c82-a418-4a1d-a364-0a0eb9073c09-kube-api-access-hd4x8\") pod \"caee6c82-a418-4a1d-a364-0a0eb9073c09\" (UID: \"caee6c82-a418-4a1d-a364-0a0eb9073c09\") " Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.653369 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caee6c82-a418-4a1d-a364-0a0eb9073c09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "caee6c82-a418-4a1d-a364-0a0eb9073c09" (UID: "caee6c82-a418-4a1d-a364-0a0eb9073c09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.656569 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caee6c82-a418-4a1d-a364-0a0eb9073c09-kube-api-access-hd4x8" (OuterVolumeSpecName: "kube-api-access-hd4x8") pod "caee6c82-a418-4a1d-a364-0a0eb9073c09" (UID: "caee6c82-a418-4a1d-a364-0a0eb9073c09"). InnerVolumeSpecName "kube-api-access-hd4x8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.755562 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caee6c82-a418-4a1d-a364-0a0eb9073c09-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.755611 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hd4x8\" (UniqueName: \"kubernetes.io/projected/caee6c82-a418-4a1d-a364-0a0eb9073c09-kube-api-access-hd4x8\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.922752 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.195:3000/\": dial tcp 10.217.0.195:3000: connect: connection refused" Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.958745 5200 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:10 crc kubenswrapper[5200]: E1126 13:55:10.958873 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data podName:4c5390ed-d89e-4450-adf3-1363b1150d1b nodeName:}" failed. No retries permitted until 2025-11-26 13:55:18.958830128 +0000 UTC m=+1487.638031946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b") : configmap "rabbitmq-cell1-config-data" not found Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.978960 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" path="/var/lib/kubelet/pods/046ac75d-4cfe-40ed-aca5-a5875ea641d8/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.980032 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" path="/var/lib/kubelet/pods/4bdf37de-cbcd-4971-a688-f3b91cf98b47/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.980638 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" path="/var/lib/kubelet/pods/4f6817f6-ce07-4b0f-a6c8-d667faa003a8/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.981837 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e57b83-77f4-446e-bc0a-c78f9094aacb" path="/var/lib/kubelet/pods/55e57b83-77f4-446e-bc0a-c78f9094aacb/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.982294 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" path="/var/lib/kubelet/pods/58fe5504-07bd-4c4d-a2b4-3fbaa402649d/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.982842 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7163ac1c-74a2-4383-bb8b-21a320eb6863" path="/var/lib/kubelet/pods/7163ac1c-74a2-4383-bb8b-21a320eb6863/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.983929 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743192ea-7611-42dc-ab35-0dd584590479" path="/var/lib/kubelet/pods/743192ea-7611-42dc-ab35-0dd584590479/volumes" Nov 
26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.984491 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" path="/var/lib/kubelet/pods/789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.985529 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" path="/var/lib/kubelet/pods/8484d8df-fa24-4e8e-85ba-c318fc6cde61/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.986081 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9254e1be-19ef-4af5-b8f8-ecea111851d9" path="/var/lib/kubelet/pods/9254e1be-19ef-4af5-b8f8-ecea111851d9/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.986633 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937b077a-80f5-4f02-a636-68791a2cfe0f" path="/var/lib/kubelet/pods/937b077a-80f5-4f02-a636-68791a2cfe0f/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.987585 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" path="/var/lib/kubelet/pods/eab77ac1-a779-46ae-b0f6-7d39a99d1ecb/volumes" Nov 26 13:55:10 crc kubenswrapper[5200]: I1126 13:55:10.988085 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" path="/var/lib/kubelet/pods/ff2e56e0-4f2e-4101-b051-111cb2cbee56/volumes" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.105607 5200 generic.go:358] "Generic (PLEG): container finished" podID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerID="78c4512cda26fc1c5231bc7d98d45ac8739cbb21786dfbb0e443ec8c003043b6" exitCode=0 Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.105780 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5f5ea8d4-ba2b-4abe-b645-0513268421f9","Type":"ContainerDied","Data":"78c4512cda26fc1c5231bc7d98d45ac8739cbb21786dfbb0e443ec8c003043b6"} Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.124924 5200 generic.go:358] "Generic (PLEG): container finished" podID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerID="6b70899d8462595d68160840ba2f772a58ad47466f6d54dfc39939ee3313bcb9" exitCode=0 Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.125058 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c5390ed-d89e-4450-adf3-1363b1150d1b","Type":"ContainerDied","Data":"6b70899d8462595d68160840ba2f772a58ad47466f6d54dfc39939ee3313bcb9"} Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.147738 5200 generic.go:358] "Generic (PLEG): container finished" podID="27a62b3a-c69a-47fa-9642-425931d81827" containerID="9d12f78cd05b747844b762871d784048deb6a46596cfe397348c7103574c5d30" exitCode=0 Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.147860 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27a62b3a-c69a-47fa-9642-425931d81827","Type":"ContainerDied","Data":"9d12f78cd05b747844b762871d784048deb6a46596cfe397348c7103574c5d30"} Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.154195 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutronca07-account-delete-xb4zf" event={"ID":"caee6c82-a418-4a1d-a364-0a0eb9073c09","Type":"ContainerDied","Data":"ac3642ca604ff8048aa6fd689e3a6746e7d1963e0b4878c04453ee4daaefd0c5"} Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.154243 
5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac3642ca604ff8048aa6fd689e3a6746e7d1963e0b4878c04453ee4daaefd0c5" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.154312 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronca07-account-delete-xb4zf" Nov 26 13:55:11 crc kubenswrapper[5200]: E1126 13:55:11.164275 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 26 13:55:11 crc kubenswrapper[5200]: E1126 13:55:11.167664 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.172668 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonebf07-account-delete-4m8mk" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.172743 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0a47-account-delete-92fz8" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.172807 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.172911 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/novaapi5c5d-account-delete-n94tm" podUID="225d875d-9c0b-4acd-9305-b5cd82ea9056" containerName="mariadb-account-delete" containerID="cri-o://7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718" gracePeriod=30 Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.173023 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance4584-account-delete-7mnx9" Nov 26 13:55:11 crc kubenswrapper[5200]: E1126 13:55:11.176304 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 26 13:55:11 crc kubenswrapper[5200]: E1126 13:55:11.176385 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="ovn-northd" probeResult="unknown" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.254372 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystonebf07-account-delete-4m8mk"] Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.263151 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystonebf07-account-delete-4m8mk"] Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.269164 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.293422 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.308175 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.365062 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab55370d-6063-4303-b300-104b73319e88-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.365097 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99b4k\" (UniqueName: \"kubernetes.io/projected/ab55370d-6063-4303-b300-104b73319e88-kube-api-access-99b4k\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466159 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-tls\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466349 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-server-conf\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466380 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-plugins-conf\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466404 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plp4v\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-kube-api-access-plp4v\") pod 
\"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466469 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c5390ed-d89e-4450-adf3-1363b1150d1b-pod-info\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466568 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c5390ed-d89e-4450-adf3-1363b1150d1b-erlang-cookie-secret\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466613 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-confd\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466652 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466718 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-erlang-cookie\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.466744 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-plugins\") pod \"4c5390ed-d89e-4450-adf3-1363b1150d1b\" (UID: \"4c5390ed-d89e-4450-adf3-1363b1150d1b\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.468588 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.469740 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.474895 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.489251 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4c5390ed-d89e-4450-adf3-1363b1150d1b-pod-info" (OuterVolumeSpecName: "pod-info") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.516317 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.517230 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.517905 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c5390ed-d89e-4450-adf3-1363b1150d1b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.519530 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-kube-api-access-plp4v" (OuterVolumeSpecName: "kube-api-access-plp4v") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "kube-api-access-plp4v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.524785 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data" (OuterVolumeSpecName: "config-data") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.537933 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.554518 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-server-conf" (OuterVolumeSpecName: "server-conf") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571491 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571525 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571536 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571546 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571554 5200 reconciler_common.go:299] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-server-conf\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571562 5200 reconciler_common.go:299] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571571 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-plp4v\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-kube-api-access-plp4v\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571579 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c5390ed-d89e-4450-adf3-1363b1150d1b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571586 5200 reconciler_common.go:299] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c5390ed-d89e-4450-adf3-1363b1150d1b-pod-info\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.571595 5200 reconciler_common.go:299] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c5390ed-d89e-4450-adf3-1363b1150d1b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.593581 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.607622 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.631896 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4c5390ed-d89e-4450-adf3-1363b1150d1b" (UID: "4c5390ed-d89e-4450-adf3-1363b1150d1b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.675970 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676201 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-kolla-config\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676455 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-plugins-conf\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676559 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-combined-ca-bundle\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676632 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676641 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-config-data-default\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676818 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf4bs\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-kube-api-access-wf4bs\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.676919 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-plugins\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677118 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f5ea8d4-ba2b-4abe-b645-0513268421f9-pod-info\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677197 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-tls\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677289 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-server-conf\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677359 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-erlang-cookie\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677426 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-galera-tls-certs\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677572 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-operator-scripts\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677662 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-confd\") pod 
\"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677771 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677845 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f5ea8d4-ba2b-4abe-b645-0513268421f9-erlang-cookie-secret\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677923 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2xxz\" (UniqueName: \"kubernetes.io/projected/27a62b3a-c69a-47fa-9642-425931d81827-kube-api-access-n2xxz\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.678076 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data\") pod \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\" (UID: \"5f5ea8d4-ba2b-4abe-b645-0513268421f9\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.678305 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27a62b3a-c69a-47fa-9642-425931d81827-config-data-generated\") pod \"27a62b3a-c69a-47fa-9642-425931d81827\" (UID: \"27a62b3a-c69a-47fa-9642-425931d81827\") " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.679338 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c5390ed-d89e-4450-adf3-1363b1150d1b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.679439 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.679627 5200 reconciler_common.go:299] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677006 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.677380 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.680286 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a62b3a-c69a-47fa-9642-425931d81827-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.681762 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.682008 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.682842 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.683549 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.686528 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-kube-api-access-wf4bs" (OuterVolumeSpecName: "kube-api-access-wf4bs") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "kube-api-access-wf4bs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.688465 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.688526 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5f5ea8d4-ba2b-4abe-b645-0513268421f9-pod-info" (OuterVolumeSpecName: "pod-info") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.697329 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f5ea8d4-ba2b-4abe-b645-0513268421f9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.709008 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.712800 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.720894 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a62b3a-c69a-47fa-9642-425931d81827-kube-api-access-n2xxz" (OuterVolumeSpecName: "kube-api-access-n2xxz") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "kube-api-access-n2xxz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.724463 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data" (OuterVolumeSpecName: "config-data") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.764313 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "27a62b3a-c69a-47fa-9642-425931d81827" (UID: "27a62b3a-c69a-47fa-9642-425931d81827"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.780151 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-server-conf" (OuterVolumeSpecName: "server-conf") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781275 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781295 5200 reconciler_common.go:299] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f5ea8d4-ba2b-4abe-b645-0513268421f9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781306 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2xxz\" (UniqueName: \"kubernetes.io/projected/27a62b3a-c69a-47fa-9642-425931d81827-kube-api-access-n2xxz\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781315 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781324 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27a62b3a-c69a-47fa-9642-425931d81827-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781338 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781346 5200 reconciler_common.go:299] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781355 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781365 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781375 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wf4bs\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-kube-api-access-wf4bs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781383 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781391 5200 reconciler_common.go:299] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f5ea8d4-ba2b-4abe-b645-0513268421f9-pod-info\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781399 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-tls\") on node \"crc\" DevicePath 
\"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781406 5200 reconciler_common.go:299] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f5ea8d4-ba2b-4abe-b645-0513268421f9-server-conf\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781414 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781422 5200 reconciler_common.go:299] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a62b3a-c69a-47fa-9642-425931d81827-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.781430 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a62b3a-c69a-47fa-9642-425931d81827-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.798501 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.802214 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.808293 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="58fe5504-07bd-4c4d-a2b4-3fbaa402649d" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.192:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.866893 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5f5ea8d4-ba2b-4abe-b645-0513268421f9" (UID: "5f5ea8d4-ba2b-4abe-b645-0513268421f9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.885418 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f5ea8d4-ba2b-4abe-b645-0513268421f9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.885453 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:11 crc kubenswrapper[5200]: I1126 13:55:11.885895 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.095318 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.096116 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.096400 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.096435 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" probeResult="unknown" Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.158858 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.161348 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:12 crc kubenswrapper[5200]: 
E1126 13:55:12.162966 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.163004 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" probeResult="unknown" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.185373 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5f5ea8d4-ba2b-4abe-b645-0513268421f9","Type":"ContainerDied","Data":"963f4c0429a8a71b49f8ba43583a756c78417e5f61ebaacc9a04799d278f16fb"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.185425 5200 scope.go:117] "RemoveContainer" containerID="78c4512cda26fc1c5231bc7d98d45ac8739cbb21786dfbb0e443ec8c003043b6" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.185591 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.200092 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c5390ed-d89e-4450-adf3-1363b1150d1b","Type":"ContainerDied","Data":"f3f4a3c3e89a0ab7552cee34135566562569879c24cc8008070380114ae9665e"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.200253 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.206894 5200 generic.go:358] "Generic (PLEG): container finished" podID="5dd57521-dfcc-4303-9f37-3485c7407851" containerID="6c6a86eadcfcca89403d03d1232bb2037cc1a1a3bd1de3d050a23ad833ce0858" exitCode=0 Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.207003 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" event={"ID":"5dd57521-dfcc-4303-9f37-3485c7407851","Type":"ContainerDied","Data":"6c6a86eadcfcca89403d03d1232bb2037cc1a1a3bd1de3d050a23ad833ce0858"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.215611 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27a62b3a-c69a-47fa-9642-425931d81827","Type":"ContainerDied","Data":"49e6428f30ffa795a18306705bd3fe6ce12473027bed514dcdf8505f5e67e50e"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.215784 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.232384 5200 generic.go:358] "Generic (PLEG): container finished" podID="11e2f1eb-442a-4f5c-845c-1ecf10744fd1" containerID="38cdc95cb224169d9fc8e6fa2db145ec76be63b846d7793abc56e1bd23843e4c" exitCode=0 Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.232523 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-557444d9fd-t9lq7" event={"ID":"11e2f1eb-442a-4f5c-845c-1ecf10744fd1","Type":"ContainerDied","Data":"38cdc95cb224169d9fc8e6fa2db145ec76be63b846d7793abc56e1bd23843e4c"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.235833 5200 generic.go:358] "Generic (PLEG): container finished" podID="adab4e53-c668-44cf-be5d-5802a77124aa" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" exitCode=0 Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.235934 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adab4e53-c668-44cf-be5d-5802a77124aa","Type":"ContainerDied","Data":"76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.242471 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_12479106-7c82-418c-b017-03e5fcc4f018/ovn-northd/0.log" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.242509 5200 generic.go:358] "Generic (PLEG): container finished" podID="12479106-7c82-418c-b017-03e5fcc4f018" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" exitCode=139 Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.242677 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"12479106-7c82-418c-b017-03e5fcc4f018","Type":"ContainerDied","Data":"70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.254992 5200 generic.go:358] "Generic (PLEG): container finished" podID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerID="52bb01fb6b98b6d753a0b12d991a396878b553610972bbf7719db7a828b9c734" exitCode=0 Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.255076 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerDied","Data":"52bb01fb6b98b6d753a0b12d991a396878b553610972bbf7719db7a828b9c734"} Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.257016 5200 generic.go:358] "Generic (PLEG): container finished" podID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerID="da8e534c170ae59a27fe6be7bcd6d2dc6851930207b80fde5ade7f1d41b386bc" exitCode=0 Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.257051 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7544499f5f-8pt52" event={"ID":"0a85e23d-55dd-4d11-b304-aeb59804f47d","Type":"ContainerDied","Data":"da8e534c170ae59a27fe6be7bcd6d2dc6851930207b80fde5ade7f1d41b386bc"} Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.327746 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8 is running failed: container process not found" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:12 crc 
kubenswrapper[5200]: E1126 13:55:12.328066 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8 is running failed: container process not found" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.330930 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8 is running failed: container process not found" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.331036 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerName="nova-cell1-conductor-conductor" probeResult="unknown" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.490163 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_12479106-7c82-418c-b017-03e5fcc4f018/ovn-northd/0.log" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.490252 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.512632 5200 scope.go:117] "RemoveContainer" containerID="76726c7914d31f34afba939283a29384999603044a931c1bcc9aba2f60a43f99" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.515312 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.524496 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.530118 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.540843 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.552150 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.563372 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.579720 5200 scope.go:117] "RemoveContainer" containerID="6b70899d8462595d68160840ba2f772a58ad47466f6d54dfc39939ee3313bcb9" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.597502 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4b9z9"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603258 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-ovn-northd-tls-certs\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603456 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-config-data\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603578 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22xrq\" (UniqueName: \"kubernetes.io/projected/0a85e23d-55dd-4d11-b304-aeb59804f47d-kube-api-access-22xrq\") pod \"0a85e23d-55dd-4d11-b304-aeb59804f47d\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603660 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/12479106-7c82-418c-b017-03e5fcc4f018-ovn-rundir\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603807 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-ceilometer-tls-certs\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603875 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-log-httpd\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.603977 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a85e23d-55dd-4d11-b304-aeb59804f47d-logs\") pod \"0a85e23d-55dd-4d11-b304-aeb59804f47d\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604056 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-run-httpd\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604149 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data-custom\") pod \"0a85e23d-55dd-4d11-b304-aeb59804f47d\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") 
" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604261 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-combined-ca-bundle\") pod \"0a85e23d-55dd-4d11-b304-aeb59804f47d\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604371 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data\") pod \"0a85e23d-55dd-4d11-b304-aeb59804f47d\" (UID: \"0a85e23d-55dd-4d11-b304-aeb59804f47d\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604517 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-metrics-certs-tls-certs\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604669 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-combined-ca-bundle\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604814 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-config\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.604918 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-combined-ca-bundle\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.605014 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6vk7\" (UniqueName: \"kubernetes.io/projected/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-kube-api-access-m6vk7\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.605162 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-sg-core-conf-yaml\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.605257 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccmg\" (UniqueName: \"kubernetes.io/projected/12479106-7c82-418c-b017-03e5fcc4f018-kube-api-access-5ccmg\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.605404 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-scripts\") pod \"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\" (UID: 
\"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.605552 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-scripts\") pod \"12479106-7c82-418c-b017-03e5fcc4f018\" (UID: \"12479106-7c82-418c-b017-03e5fcc4f018\") " Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.612627 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a85e23d-55dd-4d11-b304-aeb59804f47d-logs" (OuterVolumeSpecName: "logs") pod "0a85e23d-55dd-4d11-b304-aeb59804f47d" (UID: "0a85e23d-55dd-4d11-b304-aeb59804f47d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.612906 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12479106-7c82-418c-b017-03e5fcc4f018-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.619293 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.620396 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-config" (OuterVolumeSpecName: "config") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.621322 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-scripts" (OuterVolumeSpecName: "scripts") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.621414 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.621469 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:16.621450656 +0000 UTC m=+1485.300652474 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.622546 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:12 crc kubenswrapper[5200]: E1126 13:55:12.622584 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts podName:6d529441-c1c2-4f95-bc63-ce522b9f1b07 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:16.622575755 +0000 UTC m=+1485.301777573 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts") pod "novacell0e30c-account-delete-pnqrp" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07") : configmap "openstack-scripts" not found Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.625122 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a85e23d-55dd-4d11-b304-aeb59804f47d-kube-api-access-22xrq" (OuterVolumeSpecName: "kube-api-access-22xrq") pod "0a85e23d-55dd-4d11-b304-aeb59804f47d" (UID: "0a85e23d-55dd-4d11-b304-aeb59804f47d"). InnerVolumeSpecName "kube-api-access-22xrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.625993 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.646653 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-scripts" (OuterVolumeSpecName: "scripts") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.653706 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a85e23d-55dd-4d11-b304-aeb59804f47d" (UID: "0a85e23d-55dd-4d11-b304-aeb59804f47d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.658909 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12479106-7c82-418c-b017-03e5fcc4f018-kube-api-access-5ccmg" (OuterVolumeSpecName: "kube-api-access-5ccmg") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "kube-api-access-5ccmg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.661325 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a85e23d-55dd-4d11-b304-aeb59804f47d" (UID: "0a85e23d-55dd-4d11-b304-aeb59804f47d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.669144 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4b9z9"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.671902 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-kube-api-access-m6vk7" (OuterVolumeSpecName: "kube-api-access-m6vk7") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "kube-api-access-m6vk7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.680571 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.686510 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-4584-account-create-update-cnvc7"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.692227 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.695982 5200 scope.go:117] "RemoveContainer" containerID="fc8ecc37de36174c47bf0b036cad135c5bf8af671bf2e44d522e81e85324b227" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.699084 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4584-account-create-update-cnvc7"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708165 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708195 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708204 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22xrq\" (UniqueName: \"kubernetes.io/projected/0a85e23d-55dd-4d11-b304-aeb59804f47d-kube-api-access-22xrq\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708214 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/12479106-7c82-418c-b017-03e5fcc4f018-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708223 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708234 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a85e23d-55dd-4d11-b304-aeb59804f47d-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc 
kubenswrapper[5200]: I1126 13:55:12.708242 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708250 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708258 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708266 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12479106-7c82-418c-b017-03e5fcc4f018-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708273 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6vk7\" (UniqueName: \"kubernetes.io/projected/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-kube-api-access-m6vk7\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.708281 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ccmg\" (UniqueName: \"kubernetes.io/projected/12479106-7c82-418c-b017-03e5fcc4f018-kube-api-access-5ccmg\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.718347 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance4584-account-delete-7mnx9"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.730003 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance4584-account-delete-7mnx9"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.737671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.742631 5200 scope.go:117] "RemoveContainer" containerID="9d12f78cd05b747844b762871d784048deb6a46596cfe397348c7103574c5d30" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.745830 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.746930 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-69qj9"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.753852 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-69qj9"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.760198 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinderf459-account-delete-h8qgw"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.767511 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f459-account-create-update-rcg98"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.776633 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f459-account-create-update-rcg98"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.786544 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinderf459-account-delete-h8qgw"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.810805 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.810837 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.812996 5200 scope.go:117] "RemoveContainer" containerID="4314dc8e7e3ff2e877888334a17d2850ed752614b00e829594c4b4e7858e6db5" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.815058 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.852174 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-h9gqm"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.862057 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-h9gqm"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.869465 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-e6ea-account-create-update-4jxg5"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.875250 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e6ea-account-create-update-4jxg5"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.890801 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.912092 5200 reconciler_common.go:299] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.912349 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.933775 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-config-data" (OuterVolumeSpecName: "config-data") pod "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" (UID: "edfe6fae-0a04-415d-8cd7-ddf1dddf7d31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.935145 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.947196 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data" (OuterVolumeSpecName: "config-data") pod "0a85e23d-55dd-4d11-b304-aeb59804f47d" (UID: "0a85e23d-55dd-4d11-b304-aeb59804f47d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.950021 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placemente6ea-account-delete-s2v6r"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.958071 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placemente6ea-account-delete-s2v6r"] Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.962432 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "12479106-7c82-418c-b017-03e5fcc4f018" (UID: "12479106-7c82-418c-b017-03e5fcc4f018"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.988327 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05534089-21c2-4d8c-9b9e-d21c7082e727" path="/var/lib/kubelet/pods/05534089-21c2-4d8c-9b9e-d21c7082e727/volumes" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.988964 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0744fa2d-9b22-413a-8d7d-549c7c5c1fac" path="/var/lib/kubelet/pods/0744fa2d-9b22-413a-8d7d-549c7c5c1fac/volumes" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.989831 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da" path="/var/lib/kubelet/pods/1e4ee17b-ef5e-4dd6-aa2b-7ffa709193da/volumes" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.990946 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a62b3a-c69a-47fa-9642-425931d81827" path="/var/lib/kubelet/pods/27a62b3a-c69a-47fa-9642-425931d81827/volumes" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.992174 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" path="/var/lib/kubelet/pods/4c5390ed-d89e-4450-adf3-1363b1150d1b/volumes" Nov 26 13:55:12 crc kubenswrapper[5200]: I1126 13:55:12.996285 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.006018 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb9e671-fdb0-40d8-ab5b-df7608f8be45" path="/var/lib/kubelet/pods/4fb9e671-fdb0-40d8-ab5b-df7608f8be45/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.007173 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" path="/var/lib/kubelet/pods/5f5ea8d4-ba2b-4abe-b645-0513268421f9/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.007894 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db404a7-2c7c-4a0c-b21c-570bb473b1ff" path="/var/lib/kubelet/pods/8db404a7-2c7c-4a0c-b21c-570bb473b1ff/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.012341 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6ca788-d1ff-472b-905b-98575eb7a814" path="/var/lib/kubelet/pods/aa6ca788-d1ff-472b-905b-98575eb7a814/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.012851 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab255459-89c5-462e-a89d-eb26ba377f79" path="/var/lib/kubelet/pods/ab255459-89c5-462e-a89d-eb26ba377f79/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.015407 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab55370d-6063-4303-b300-104b73319e88" path="/var/lib/kubelet/pods/ab55370d-6063-4303-b300-104b73319e88/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.015869 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data-custom\") pod \"5dd57521-dfcc-4303-9f37-3485c7407851\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.015996 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data\") pod \"5dd57521-dfcc-4303-9f37-3485c7407851\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.016379 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57a6d2a-4df7-4101-a9ac-edd0e76be131" path="/var/lib/kubelet/pods/b57a6d2a-4df7-4101-a9ac-edd0e76be131/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.017221 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" path="/var/lib/kubelet/pods/d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.018041 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd57521-dfcc-4303-9f37-3485c7407851-logs\") pod \"5dd57521-dfcc-4303-9f37-3485c7407851\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.019498 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-combined-ca-bundle\") pod \"5dd57521-dfcc-4303-9f37-3485c7407851\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.019597 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnwxq\" (UniqueName: \"kubernetes.io/projected/5dd57521-dfcc-4303-9f37-3485c7407851-kube-api-access-gnwxq\") pod \"5dd57521-dfcc-4303-9f37-3485c7407851\" (UID: \"5dd57521-dfcc-4303-9f37-3485c7407851\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.019973 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd57521-dfcc-4303-9f37-3485c7407851-logs" (OuterVolumeSpecName: "logs") pod "5dd57521-dfcc-4303-9f37-3485c7407851" (UID: "5dd57521-dfcc-4303-9f37-3485c7407851"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.026611 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd57521-dfcc-4303-9f37-3485c7407851-kube-api-access-gnwxq" (OuterVolumeSpecName: "kube-api-access-gnwxq") pod "5dd57521-dfcc-4303-9f37-3485c7407851" (UID: "5dd57521-dfcc-4303-9f37-3485c7407851"). InnerVolumeSpecName "kube-api-access-gnwxq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.034403 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5dd57521-dfcc-4303-9f37-3485c7407851" (UID: "5dd57521-dfcc-4303-9f37-3485c7407851"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.035128 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4" path="/var/lib/kubelet/pods/fe8fae71-4fe0-40cf-bcbc-7ca5e950aef4/volumes" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036094 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a85e23d-55dd-4d11-b304-aeb59804f47d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036191 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnwxq\" (UniqueName: \"kubernetes.io/projected/5dd57521-dfcc-4303-9f37-3485c7407851-kube-api-access-gnwxq\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036265 5200 reconciler_common.go:299] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036326 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036387 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/12479106-7c82-418c-b017-03e5fcc4f018-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036443 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.036501 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd57521-dfcc-4303-9f37-3485c7407851-logs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.052467 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.076888 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.083215 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.092142 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dd57521-dfcc-4303-9f37-3485c7407851" (UID: "5dd57521-dfcc-4303-9f37-3485c7407851"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.097886 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.117508 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data" (OuterVolumeSpecName: "config-data") pod "5dd57521-dfcc-4303-9f37-3485c7407851" (UID: "5dd57521-dfcc-4303-9f37-3485c7407851"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138459 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-combined-ca-bundle\") pod \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138519 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-combined-ca-bundle\") pod \"aae8bd0a-01d7-497d-b7fb-26c495823af7\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138582 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-fernet-keys\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138625 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhvm\" (UniqueName: \"kubernetes.io/projected/adab4e53-c668-44cf-be5d-5802a77124aa-kube-api-access-xkhvm\") pod \"adab4e53-c668-44cf-be5d-5802a77124aa\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138729 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-combined-ca-bundle\") pod \"adab4e53-c668-44cf-be5d-5802a77124aa\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138755 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-scripts\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138832 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4n7m\" (UniqueName: \"kubernetes.io/projected/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-kube-api-access-p4n7m\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138912 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-config-data\") pod \"aae8bd0a-01d7-497d-b7fb-26c495823af7\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.138938 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-credential-keys\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139012 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-public-tls-certs\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139041 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z9qn\" (UniqueName: \"kubernetes.io/projected/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-kube-api-access-4z9qn\") pod \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139065 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-config-data\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139121 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztnfv\" (UniqueName: \"kubernetes.io/projected/aae8bd0a-01d7-497d-b7fb-26c495823af7-kube-api-access-ztnfv\") pod \"aae8bd0a-01d7-497d-b7fb-26c495823af7\" (UID: \"aae8bd0a-01d7-497d-b7fb-26c495823af7\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139149 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-config-data\") pod \"adab4e53-c668-44cf-be5d-5802a77124aa\" (UID: \"adab4e53-c668-44cf-be5d-5802a77124aa\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139211 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-config-data\") pod \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\" (UID: \"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139236 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-combined-ca-bundle\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139274 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-internal-tls-certs\") pod \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\" (UID: \"11e2f1eb-442a-4f5c-845c-1ecf10744fd1\") " Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139772 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.139806 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd57521-dfcc-4303-9f37-3485c7407851-config-data\") on node \"crc\" DevicePath \"\"" Nov 
26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.159174 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adab4e53-c668-44cf-be5d-5802a77124aa-kube-api-access-xkhvm" (OuterVolumeSpecName: "kube-api-access-xkhvm") pod "adab4e53-c668-44cf-be5d-5802a77124aa" (UID: "adab4e53-c668-44cf-be5d-5802a77124aa"). InnerVolumeSpecName "kube-api-access-xkhvm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.164380 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.173107 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7lcx4"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.179392 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7lcx4"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.193105 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae8bd0a-01d7-497d-b7fb-26c495823af7-kube-api-access-ztnfv" (OuterVolumeSpecName: "kube-api-access-ztnfv") pod "aae8bd0a-01d7-497d-b7fb-26c495823af7" (UID: "aae8bd0a-01d7-497d-b7fb-26c495823af7"). InnerVolumeSpecName "kube-api-access-ztnfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.200831 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-scripts" (OuterVolumeSpecName: "scripts") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.207510 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.207747 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-kube-api-access-4z9qn" (OuterVolumeSpecName: "kube-api-access-4z9qn") pod "8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" (UID: "8edf3f05-24ba-41fa-82bd-3a4e57a1cee6"). InnerVolumeSpecName "kube-api-access-4z9qn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.210497 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-kube-api-access-p4n7m" (OuterVolumeSpecName: "kube-api-access-p4n7m") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "kube-api-access-p4n7m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.211441 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-config-data" (OuterVolumeSpecName: "config-data") pod "aae8bd0a-01d7-497d-b7fb-26c495823af7" (UID: "aae8bd0a-01d7-497d-b7fb-26c495823af7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.214750 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-config-data" (OuterVolumeSpecName: "config-data") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.215917 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-config-data" (OuterVolumeSpecName: "config-data") pod "8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" (UID: "8edf3f05-24ba-41fa-82bd-3a4e57a1cee6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.221508 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutronca07-account-delete-xb4zf"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.235761 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.244284 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adab4e53-c668-44cf-be5d-5802a77124aa" (UID: "adab4e53-c668-44cf-be5d-5802a77124aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245318 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245369 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4n7m\" (UniqueName: \"kubernetes.io/projected/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-kube-api-access-p4n7m\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245382 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245395 5200 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245405 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4z9qn\" (UniqueName: \"kubernetes.io/projected/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-kube-api-access-4z9qn\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245414 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245425 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ztnfv\" (UniqueName: \"kubernetes.io/projected/aae8bd0a-01d7-497d-b7fb-26c495823af7-kube-api-access-ztnfv\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245433 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245442 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245451 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.245460 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkhvm\" (UniqueName: \"kubernetes.io/projected/adab4e53-c668-44cf-be5d-5802a77124aa-kube-api-access-xkhvm\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.249718 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" (UID: "8edf3f05-24ba-41fa-82bd-3a4e57a1cee6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.256240 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutronca07-account-delete-xb4zf"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.275038 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.275221 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ca07-account-create-update-zddj2"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.275878 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-config-data" (OuterVolumeSpecName: "config-data") pod "adab4e53-c668-44cf-be5d-5802a77124aa" (UID: "adab4e53-c668-44cf-be5d-5802a77124aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.276930 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aae8bd0a-01d7-497d-b7fb-26c495823af7" (UID: "aae8bd0a-01d7-497d-b7fb-26c495823af7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.278339 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.280875 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55d547fcd6-blkmf" event={"ID":"5dd57521-dfcc-4303-9f37-3485c7407851","Type":"ContainerDied","Data":"037187032bb8ed0fb52dcce979cf8569f7b3faa56b155c563a3e7543d2392b63"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.280948 5200 scope.go:117] "RemoveContainer" containerID="6c6a86eadcfcca89403d03d1232bb2037cc1a1a3bd1de3d050a23ad833ce0858" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.283308 5200 generic.go:358] "Generic (PLEG): container finished" podID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" exitCode=0 Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.283434 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6","Type":"ContainerDied","Data":"30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.283906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8edf3f05-24ba-41fa-82bd-3a4e57a1cee6","Type":"ContainerDied","Data":"83dc709ebbea8111c9b9ced4f84a4d02d0ffc9256e56133973446b3f6feb457e"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.283926 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ca07-account-create-update-zddj2"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.284017 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.286812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "11e2f1eb-442a-4f5c-845c-1ecf10744fd1" (UID: "11e2f1eb-442a-4f5c-845c-1ecf10744fd1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.292431 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-txf2x"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.299003 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-txf2x"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.303222 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-557444d9fd-t9lq7" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.303249 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-557444d9fd-t9lq7" event={"ID":"11e2f1eb-442a-4f5c-845c-1ecf10744fd1","Type":"ContainerDied","Data":"0951a95ec5b27dd8b9357ffa17683cda7d8b9917387fd66d3cfb78e3ab66b742"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.306808 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0a47-account-create-update-n4bxl"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.309920 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.309947 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adab4e53-c668-44cf-be5d-5802a77124aa","Type":"ContainerDied","Data":"766214c0e0a3d2c44051458267145692b94ccfa19f1796075fe1fd732a8f4575"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.313559 5200 scope.go:117] "RemoveContainer" containerID="f9fa9adbbf6d1354d89225638d8e5a8383e2d9b6d0d928de2d7dc4db987750c0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.314447 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_12479106-7c82-418c-b017-03e5fcc4f018/ovn-northd/0.log" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.314566 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"12479106-7c82-418c-b017-03e5fcc4f018","Type":"ContainerDied","Data":"6fad5f9ac71a608927bf373e740c42fd1061b54d5ef59a0fb256f2b53ad1b2d8"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.314608 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.318726 5200 generic.go:358] "Generic (PLEG): container finished" podID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" exitCode=0 Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.318985 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aae8bd0a-01d7-497d-b7fb-26c495823af7","Type":"ContainerDied","Data":"26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.319045 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"aae8bd0a-01d7-497d-b7fb-26c495823af7","Type":"ContainerDied","Data":"0a42a963ec31c8af8a7becd2d3868047ff5ae811d600960f93340e53053d2c31"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.319156 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.321667 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican0a47-account-delete-92fz8"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.333386 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edfe6fae-0a04-415d-8cd7-ddf1dddf7d31","Type":"ContainerDied","Data":"ca264c34083f559cbdf6d65c618fba39485e4e5b470cbbf0162a97b8278f710d"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.333573 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.337209 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0a47-account-create-update-n4bxl"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.346666 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.346974 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.347048 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae8bd0a-01d7-497d-b7fb-26c495823af7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.347106 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.347186 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e2f1eb-442a-4f5c-845c-1ecf10744fd1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.347250 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adab4e53-c668-44cf-be5d-5802a77124aa-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.347319 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican0a47-account-delete-92fz8"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.349725 5200 scope.go:117] "RemoveContainer" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.351018 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7544499f5f-8pt52" event={"ID":"0a85e23d-55dd-4d11-b304-aeb59804f47d","Type":"ContainerDied","Data":"837dc8f816442390ac263aa5e594d57578d80c828712f35b3d6e908392d57d69"} Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.351126 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7544499f5f-8pt52" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.361367 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-55d547fcd6-blkmf"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.387370 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-55d547fcd6-blkmf"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.390667 5200 scope.go:117] "RemoveContainer" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" Nov 26 13:55:13 crc kubenswrapper[5200]: E1126 13:55:13.391135 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851\": container with ID starting with 30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851 not found: ID does not exist" containerID="30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.391170 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851"} err="failed to get container status \"30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851\": rpc error: code = NotFound desc = could not find container \"30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851\": container with ID starting with 30b1a56a14bf1b83dd4f770d6f75bab030f423cfd1f0118a33f9373ca984b851 not found: ID does not exist" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.391198 5200 scope.go:117] "RemoveContainer" containerID="38cdc95cb224169d9fc8e6fa2db145ec76be63b846d7793abc56e1bd23843e4c" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.396069 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.402480 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.411101 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.418017 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.427577 5200 scope.go:117] "RemoveContainer" containerID="76a9d9a1f64b6c4ffef9c97b17a814073c3a7a109c7e23a6e73c8c590a0c51ba" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.427787 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-557444d9fd-t9lq7"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.435730 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-557444d9fd-t9lq7"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.440864 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7544499f5f-8pt52"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.446052 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7544499f5f-8pt52"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.449629 5200 scope.go:117] "RemoveContainer" containerID="5a4f406803509ed4c9f81a25d1d85f51801ee9e253c61023a592d2368060e342" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.452459 5200 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.457110 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.462148 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.466803 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.473423 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.475497 5200 scope.go:117] "RemoveContainer" containerID="70813568bd1b496f9712e2f6a64e51693f4996707b3652f71f1eba3bb8950141" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.478484 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.502476 5200 scope.go:117] "RemoveContainer" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.528841 5200 scope.go:117] "RemoveContainer" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" Nov 26 13:55:13 crc kubenswrapper[5200]: E1126 13:55:13.529336 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8\": container with ID starting with 26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8 not found: ID does not exist" containerID="26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.529364 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8"} err="failed to get container status \"26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8\": rpc error: code = NotFound desc = could not find container \"26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8\": container with ID starting with 26d9971fd5f2c5434129885663f8f0407778b5822c822965faf65f7b651330b8 not found: ID does not exist" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.529385 5200 scope.go:117] "RemoveContainer" containerID="b9326640d2dd0dc2cb465cceabfeb8666c56ac7ec888f7a972c9d1baf978168e" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.556248 5200 scope.go:117] "RemoveContainer" containerID="c12f4bdddf37a1ef5e860426fe948bdb51ce3f0edf70735ec8c9ab110cb59d7e" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.575356 5200 scope.go:117] "RemoveContainer" containerID="52bb01fb6b98b6d753a0b12d991a396878b553610972bbf7719db7a828b9c734" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.595971 5200 scope.go:117] "RemoveContainer" containerID="b908cb3b6ebf4aeb47d606fa2e54c25261e51e4dbd764b3dd457699d4b48474a" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.614350 5200 scope.go:117] "RemoveContainer" containerID="da8e534c170ae59a27fe6be7bcd6d2dc6851930207b80fde5ade7f1d41b386bc" Nov 26 13:55:13 crc kubenswrapper[5200]: I1126 13:55:13.637539 5200 scope.go:117] "RemoveContainer" containerID="fea23ba732bf069b37174feb4444f8fe8af7b5feb6c583a3ca3c7cf599079d2c" Nov 26 13:55:13 crc 
kubenswrapper[5200]: I1126 13:55:13.826608 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d9fcdcd8b-tl47q" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.387588 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.188:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.985633 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550" path="/var/lib/kubelet/pods/0569f2fb-6ff0-4cf0-bb17-b2b6fd2a1550/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.987020 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" path="/var/lib/kubelet/pods/0a85e23d-55dd-4d11-b304-aeb59804f47d/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.988260 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e2f1eb-442a-4f5c-845c-1ecf10744fd1" path="/var/lib/kubelet/pods/11e2f1eb-442a-4f5c-845c-1ecf10744fd1/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.990540 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12479106-7c82-418c-b017-03e5fcc4f018" path="/var/lib/kubelet/pods/12479106-7c82-418c-b017-03e5fcc4f018/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.991848 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbc6b47-81e8-4999-9cd8-1744a64e8158" path="/var/lib/kubelet/pods/2bbc6b47-81e8-4999-9cd8-1744a64e8158/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.993082 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5179fdd0-fdef-4ddd-8d4f-0e13054dd47a" path="/var/lib/kubelet/pods/5179fdd0-fdef-4ddd-8d4f-0e13054dd47a/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.995624 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" path="/var/lib/kubelet/pods/5dd57521-dfcc-4303-9f37-3485c7407851/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.996997 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac" path="/var/lib/kubelet/pods/7f0d85c7-7ad6-4e48-b3db-5c82ecfaf6ac/volumes" Nov 26 13:55:14 crc kubenswrapper[5200]: I1126 13:55:14.998217 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" path="/var/lib/kubelet/pods/8edf3f05-24ba-41fa-82bd-3a4e57a1cee6/volumes" Nov 26 13:55:15 crc kubenswrapper[5200]: I1126 13:55:14.999356 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" path="/var/lib/kubelet/pods/8fdc5a98-ccb9-4059-ba29-729ad8af9c4f/volumes" Nov 26 13:55:15 crc kubenswrapper[5200]: I1126 13:55:14.999923 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" 
path="/var/lib/kubelet/pods/aae8bd0a-01d7-497d-b7fb-26c495823af7/volumes" Nov 26 13:55:15 crc kubenswrapper[5200]: I1126 13:55:15.000422 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adab4e53-c668-44cf-be5d-5802a77124aa" path="/var/lib/kubelet/pods/adab4e53-c668-44cf-be5d-5802a77124aa/volumes" Nov 26 13:55:15 crc kubenswrapper[5200]: I1126 13:55:15.000919 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caee6c82-a418-4a1d-a364-0a0eb9073c09" path="/var/lib/kubelet/pods/caee6c82-a418-4a1d-a364-0a0eb9073c09/volumes" Nov 26 13:55:15 crc kubenswrapper[5200]: I1126 13:55:15.002251 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" path="/var/lib/kubelet/pods/edfe6fae-0a04-415d-8cd7-ddf1dddf7d31/volumes" Nov 26 13:55:16 crc kubenswrapper[5200]: E1126 13:55:16.679781 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:16 crc kubenswrapper[5200]: E1126 13:55:16.680121 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:24.680100836 +0000 UTC m=+1493.359302654 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found Nov 26 13:55:16 crc kubenswrapper[5200]: E1126 13:55:16.679846 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:16 crc kubenswrapper[5200]: E1126 13:55:16.680250 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts podName:6d529441-c1c2-4f95-bc63-ce522b9f1b07 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:24.68022847 +0000 UTC m=+1493.359430358 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts") pod "novacell0e30c-account-delete-pnqrp" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07") : configmap "openstack-scripts" not found Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.087725 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.088368 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.088670 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.088825 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" probeResult="unknown" Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.161624 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.165918 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.167607 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:17 crc kubenswrapper[5200]: E1126 13:55:17.167703 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot 
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" probeResult="unknown" Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.087187 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.087488 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.087762 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.087795 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" probeResult="unknown" Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.158101 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.159836 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.161223 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:22 crc kubenswrapper[5200]: E1126 13:55:22.161264 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: 
, stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" probeResult="unknown" Nov 26 13:55:23 crc kubenswrapper[5200]: I1126 13:55:23.806823 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/neutron-747455c75f-8zrqn" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": dial tcp 10.217.0.160:9696: connect: connection refused" Nov 26 13:55:24 crc kubenswrapper[5200]: E1126 13:55:24.720369 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:24 crc kubenswrapper[5200]: E1126 13:55:24.721774 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:40.721730029 +0000 UTC m=+1509.400931887 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found Nov 26 13:55:24 crc kubenswrapper[5200]: E1126 13:55:24.720375 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 26 13:55:24 crc kubenswrapper[5200]: E1126 13:55:24.722962 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts podName:6d529441-c1c2-4f95-bc63-ce522b9f1b07 nodeName:}" failed. No retries permitted until 2025-11-26 13:55:40.722934 +0000 UTC m=+1509.402135858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts") pod "novacell0e30c-account-delete-pnqrp" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07") : configmap "openstack-scripts" not found Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.483812 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.553249 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-config\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.553333 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-combined-ca-bundle\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.553393 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-ovndb-tls-certs\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.553516 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-internal-tls-certs\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.553561 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-public-tls-certs\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.553584 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x6m5\" (UniqueName: \"kubernetes.io/projected/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-kube-api-access-8x6m5\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.554288 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-httpd-config\") pod \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\" (UID: \"c3f6e95d-1b68-4641-a4e5-6bd172c0662e\") " Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.563149 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.566196 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-kube-api-access-8x6m5" (OuterVolumeSpecName: "kube-api-access-8x6m5") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "kube-api-access-8x6m5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.605234 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.606978 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.607985 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.630348 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-config" (OuterVolumeSpecName: "config") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.644563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c3f6e95d-1b68-4641-a4e5-6bd172c0662e" (UID: "c3f6e95d-1b68-4641-a4e5-6bd172c0662e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656571 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656608 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656624 5200 reconciler_common.go:299] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656636 5200 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656648 5200 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656660 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8x6m5\" (UniqueName: \"kubernetes.io/projected/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-kube-api-access-8x6m5\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.656675 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f6e95d-1b68-4641-a4e5-6bd172c0662e-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.695463 5200 generic.go:358] "Generic (PLEG): container finished" podID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerID="d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb" exitCode=0 Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.695575 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747455c75f-8zrqn" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.695589 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747455c75f-8zrqn" event={"ID":"c3f6e95d-1b68-4641-a4e5-6bd172c0662e","Type":"ContainerDied","Data":"d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb"} Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.695642 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747455c75f-8zrqn" event={"ID":"c3f6e95d-1b68-4641-a4e5-6bd172c0662e","Type":"ContainerDied","Data":"349ff0104322cc4005bff409f30124aa60c888205b3b411fd3fe645f50aa4258"} Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.695663 5200 scope.go:117] "RemoveContainer" containerID="50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.739098 5200 scope.go:117] "RemoveContainer" containerID="d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.741831 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-747455c75f-8zrqn"] Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.748575 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-747455c75f-8zrqn"] Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.760210 5200 scope.go:117] "RemoveContainer" containerID="50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646" Nov 26 13:55:26 crc kubenswrapper[5200]: E1126 13:55:26.760644 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646\": container with ID starting with 50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646 not found: ID does not exist" containerID="50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.760703 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646"} err="failed to get container status \"50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646\": rpc error: code = NotFound desc = could not find container \"50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646\": container with ID starting with 50cc0c8a2ea9c89d8d53d94677d3a411ca03c6f5439c51aa599192421e54c646 not found: ID does not exist" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.760734 5200 scope.go:117] "RemoveContainer" containerID="d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb" Nov 26 13:55:26 crc kubenswrapper[5200]: E1126 13:55:26.760977 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb\": container with ID starting with d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb not found: ID does not exist" containerID="d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb" Nov 26 13:55:26 crc kubenswrapper[5200]: I1126 13:55:26.761009 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb"} err="failed to get container status 
\"d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb\": rpc error: code = NotFound desc = could not find container \"d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb\": container with ID starting with d10f2c7aeb7c9b0244f48bf6dd05a75f7ce25ba808af357f7d17b884b96a18bb not found: ID does not exist" Nov 26 13:55:27 crc kubenswrapper[5200]: I1126 13:55:27.010328 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" path="/var/lib/kubelet/pods/c3f6e95d-1b68-4641-a4e5-6bd172c0662e/volumes" Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.087358 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.088142 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.088559 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.088629 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" probeResult="unknown" Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.157918 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.159738 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.161893 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:27 crc kubenswrapper[5200]: E1126 13:55:27.161950 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" probeResult="unknown" Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.086734 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.087992 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.088593 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.088634 5200 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" probeResult="unknown" Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.158794 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.160349 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.162220 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 26 13:55:32 crc kubenswrapper[5200]: E1126 13:55:32.162256 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qk25n" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" probeResult="unknown" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.773549 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qk25n_257070be-da5f-4601-b466-b9f7c8c8f153/ovs-vswitchd/0.log" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.774207 5200 generic.go:358] "Generic (PLEG): container finished" podID="257070be-da5f-4601-b466-b9f7c8c8f153" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" exitCode=137 Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.774635 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerDied","Data":"7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a"} Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.774709 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qk25n" event={"ID":"257070be-da5f-4601-b466-b9f7c8c8f153","Type":"ContainerDied","Data":"c03d27d3370349ea15d3c4f499299d5e80b62d8c1de43874581cb87ebcce774e"} Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.774723 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03d27d3370349ea15d3c4f499299d5e80b62d8c1de43874581cb87ebcce774e" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.796856 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qk25n_257070be-da5f-4601-b466-b9f7c8c8f153/ovs-vswitchd/0.log" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.799087 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.896928 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n6l7\" (UniqueName: \"kubernetes.io/projected/257070be-da5f-4601-b466-b9f7c8c8f153-kube-api-access-7n6l7\") pod \"257070be-da5f-4601-b466-b9f7c8c8f153\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897021 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-run\") pod \"257070be-da5f-4601-b466-b9f7c8c8f153\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/257070be-da5f-4601-b466-b9f7c8c8f153-scripts\") pod \"257070be-da5f-4601-b466-b9f7c8c8f153\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897163 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-run" (OuterVolumeSpecName: "var-run") pod "257070be-da5f-4601-b466-b9f7c8c8f153" (UID: "257070be-da5f-4601-b466-b9f7c8c8f153"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897234 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-log\") pod \"257070be-da5f-4601-b466-b9f7c8c8f153\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897262 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-log" (OuterVolumeSpecName: "var-log") pod "257070be-da5f-4601-b466-b9f7c8c8f153" (UID: "257070be-da5f-4601-b466-b9f7c8c8f153"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897340 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-etc-ovs\") pod \"257070be-da5f-4601-b466-b9f7c8c8f153\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897400 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "257070be-da5f-4601-b466-b9f7c8c8f153" (UID: "257070be-da5f-4601-b466-b9f7c8c8f153"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897430 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-lib\") pod \"257070be-da5f-4601-b466-b9f7c8c8f153\" (UID: \"257070be-da5f-4601-b466-b9f7c8c8f153\") " Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897518 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-lib" (OuterVolumeSpecName: "var-lib") pod "257070be-da5f-4601-b466-b9f7c8c8f153" (UID: "257070be-da5f-4601-b466-b9f7c8c8f153"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897749 5200 reconciler_common.go:299] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-etc-ovs\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897764 5200 reconciler_common.go:299] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-lib\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897772 5200 reconciler_common.go:299] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-run\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.897782 5200 reconciler_common.go:299] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/257070be-da5f-4601-b466-b9f7c8c8f153-var-log\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.898313 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257070be-da5f-4601-b466-b9f7c8c8f153-scripts" (OuterVolumeSpecName: "scripts") pod "257070be-da5f-4601-b466-b9f7c8c8f153" (UID: "257070be-da5f-4601-b466-b9f7c8c8f153"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 13:55:33 crc kubenswrapper[5200]: I1126 13:55:33.903604 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257070be-da5f-4601-b466-b9f7c8c8f153-kube-api-access-7n6l7" (OuterVolumeSpecName: "kube-api-access-7n6l7") pod "257070be-da5f-4601-b466-b9f7c8c8f153" (UID: "257070be-da5f-4601-b466-b9f7c8c8f153"). InnerVolumeSpecName "kube-api-access-7n6l7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.001111 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7n6l7\" (UniqueName: \"kubernetes.io/projected/257070be-da5f-4601-b466-b9f7c8c8f153-kube-api-access-7n6l7\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.001158 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/257070be-da5f-4601-b466-b9f7c8c8f153-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.792962 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerID="d6a3d13e69838eb9d4252d0286848176a74b5a51be636cf9579e83eb743b5064" exitCode=137 Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.793027 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"d6a3d13e69838eb9d4252d0286848176a74b5a51be636cf9579e83eb743b5064"} Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.794074 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qk25n" Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.843350 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-qk25n"] Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.874152 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-qk25n"] Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.950028 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 26 13:55:34 crc kubenswrapper[5200]: I1126 13:55:34.982102 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" path="/var/lib/kubelet/pods/257070be-da5f-4601-b466-b9f7c8c8f153/volumes" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.018201 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9e8c608b-5148-492a-a8a7-b8eccc586561\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.018467 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-cache\") pod \"9e8c608b-5148-492a-a8a7-b8eccc586561\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.018525 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-lock\") pod \"9e8c608b-5148-492a-a8a7-b8eccc586561\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.018549 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq8rq\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-kube-api-access-pq8rq\") pod \"9e8c608b-5148-492a-a8a7-b8eccc586561\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.018600 5200 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") pod \"9e8c608b-5148-492a-a8a7-b8eccc586561\" (UID: \"9e8c608b-5148-492a-a8a7-b8eccc586561\") " Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.018950 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-cache" (OuterVolumeSpecName: "cache") pod "9e8c608b-5148-492a-a8a7-b8eccc586561" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.019836 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-lock" (OuterVolumeSpecName: "lock") pod "9e8c608b-5148-492a-a8a7-b8eccc586561" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.024364 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "9e8c608b-5148-492a-a8a7-b8eccc586561" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.024433 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9e8c608b-5148-492a-a8a7-b8eccc586561" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.033579 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-kube-api-access-pq8rq" (OuterVolumeSpecName: "kube-api-access-pq8rq") pod "9e8c608b-5148-492a-a8a7-b8eccc586561" (UID: "9e8c608b-5148-492a-a8a7-b8eccc586561"). InnerVolumeSpecName "kube-api-access-pq8rq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.120034 5200 reconciler_common.go:299] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-lock\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.120073 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pq8rq\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-kube-api-access-pq8rq\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.120083 5200 reconciler_common.go:299] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e8c608b-5148-492a-a8a7-b8eccc586561-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.120111 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.120121 5200 reconciler_common.go:299] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9e8c608b-5148-492a-a8a7-b8eccc586561-cache\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.137816 5200 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.145182 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qk25n_257070be-da5f-4601-b466-b9f7c8c8f153/ovs-vswitchd/0.log" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.158774 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qk25n_257070be-da5f-4601-b466-b9f7c8c8f153/ovs-vswitchd/0.log" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.221149 5200 reconciler_common.go:299] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.223944 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.230587 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.233865 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.238895 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.820291 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9e8c608b-5148-492a-a8a7-b8eccc586561","Type":"ContainerDied","Data":"48497cc22cc886617125fd444649afd27364fb12110d4bab92a3a69eb60bd8b4"} Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.820379 
5200 scope.go:117] "RemoveContainer" containerID="d6a3d13e69838eb9d4252d0286848176a74b5a51be636cf9579e83eb743b5064"
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.820878 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.862936 5200 scope.go:117] "RemoveContainer" containerID="8a3c5bcf0c2ae9a26e17f253d1e64688670620c4dc1055a422aca31fb19825c3"
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.901076 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.908437 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"]
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.917300 5200 scope.go:117] "RemoveContainer" containerID="87a7e0f4bc1babfa0daa5d1e1ca5980686a23a89b61b6d2d33f6849a4eb58074"
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.949622 5200 scope.go:117] "RemoveContainer" containerID="e400c5c5cf3eebf7e008c823d4b2464e8e01eeefe341a90c08ccbb54d5c9bc15"
Nov 26 13:55:35 crc kubenswrapper[5200]: I1126 13:55:35.976028 5200 scope.go:117] "RemoveContainer" containerID="3bfc3a40523ed141ac9ba98103ae3054008e91cd9185bc26c4c466d7974e3c46"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.002350 5200 scope.go:117] "RemoveContainer" containerID="5c2fe7129b5e32d44a73bbe4f844f7bea44dd368fbaba824bda4a249bc55166e"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.031133 5200 scope.go:117] "RemoveContainer" containerID="f63f390a689d7edab2c7f57aaab9b53c11b9ab447c109328c91f7c6d40c5f5ec"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.062614 5200 scope.go:117] "RemoveContainer" containerID="f5ee039fbe62517040d20e45c50d6d689b188548be0f48205093fbd0376ceb73"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.086869 5200 scope.go:117] "RemoveContainer" containerID="73f91d32af16b5469f8f5d9eeb6ad5b3a0299f66553fb69a26e37fd1e4724334"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.111486 5200 scope.go:117] "RemoveContainer" containerID="2ab0a7853e519d877df1c22e5ebeb99b1c64c7a449e720446cdb4ecc46d44e12"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.137224 5200 scope.go:117] "RemoveContainer" containerID="8b51fec0244bcdc2b862bc7cdd69319c60e9093b1b1eee3641c5eec3c4239828"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.156293 5200 scope.go:117] "RemoveContainer" containerID="42f52ac7a2a3d8ce2846507f56c360127f1cd001844f1aa503e54d119743673b"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.177270 5200 scope.go:117] "RemoveContainer" containerID="2eb02dc54fa136e7c5025e9634090a7041d9069cad341af2a5616acb5c290480"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.201660 5200 scope.go:117] "RemoveContainer" containerID="87378e41d86dff6f16d62f1a53be1da15948d42fc9a54646c7077b0037089474"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.230041 5200 scope.go:117] "RemoveContainer" containerID="903ac30a9dbec48f735e990bea93a6806f1762703320ed6711cbac6fbff79c06"
Nov 26 13:55:36 crc kubenswrapper[5200]: I1126 13:55:36.991001 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" path="/var/lib/kubelet/pods/9e8c608b-5148-492a-a8a7-b8eccc586561/volumes"
Nov 26 13:55:37 crc kubenswrapper[5200]: I1126 13:55:37.660419 5200 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8238a7a4-ddcb-4cc8-ac43-72042e4c4f50] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8238a7a4_ddcb_4cc8_ac43_72042e4c4f50.slice"
Nov 26 13:55:37 crc kubenswrapper[5200]: E1126 13:55:37.660847 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod8238a7a4-ddcb-4cc8-ac43-72042e4c4f50] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod8238a7a4-ddcb-4cc8-ac43-72042e4c4f50] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8238a7a4_ddcb_4cc8_ac43_72042e4c4f50.slice" pod="openstack/ovsdbserver-nb-0" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50"
Nov 26 13:55:37 crc kubenswrapper[5200]: I1126 13:55:37.844998 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 26 13:55:37 crc kubenswrapper[5200]: I1126 13:55:37.900504 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 26 13:55:37 crc kubenswrapper[5200]: I1126 13:55:37.906063 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 26 13:55:38 crc kubenswrapper[5200]: I1126 13:55:38.990750 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8238a7a4-ddcb-4cc8-ac43-72042e4c4f50" path="/var/lib/kubelet/pods/8238a7a4-ddcb-4cc8-ac43-72042e4c4f50/volumes"
Nov 26 13:55:39 crc kubenswrapper[5200]: I1126 13:55:39.648771 5200 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podff2e56e0-4f2e-4101-b051-111cb2cbee56"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podff2e56e0-4f2e-4101-b051-111cb2cbee56] : Timed out while waiting for systemd to remove kubepods-besteffort-podff2e56e0_4f2e_4101_b051_111cb2cbee56.slice"
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.491026 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e30c-account-delete-pnqrp"
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.612379 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts\") pod \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") "
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.612854 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr7wh\" (UniqueName: \"kubernetes.io/projected/6d529441-c1c2-4f95-bc63-ce522b9f1b07-kube-api-access-vr7wh\") pod \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\" (UID: \"6d529441-c1c2-4f95-bc63-ce522b9f1b07\") "
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.613483 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d529441-c1c2-4f95-bc63-ce522b9f1b07" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.620897 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d529441-c1c2-4f95-bc63-ce522b9f1b07-kube-api-access-vr7wh" (OuterVolumeSpecName: "kube-api-access-vr7wh") pod "6d529441-c1c2-4f95-bc63-ce522b9f1b07" (UID: "6d529441-c1c2-4f95-bc63-ce522b9f1b07"). InnerVolumeSpecName "kube-api-access-vr7wh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.714958 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d529441-c1c2-4f95-bc63-ce522b9f1b07-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.714991 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vr7wh\" (UniqueName: \"kubernetes.io/projected/6d529441-c1c2-4f95-bc63-ce522b9f1b07-kube-api-access-vr7wh\") on node \"crc\" DevicePath \"\""
Nov 26 13:55:40 crc kubenswrapper[5200]: E1126 13:55:40.817042 5200 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 26 13:55:40 crc kubenswrapper[5200]: E1126 13:55:40.817157 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts podName:225d875d-9c0b-4acd-9305-b5cd82ea9056 nodeName:}" failed. No retries permitted until 2025-11-26 13:56:12.817132219 +0000 UTC m=+1541.496334037 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts") pod "novaapi5c5d-account-delete-n94tm" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056") : configmap "openstack-scripts" not found
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.879676 5200 generic.go:358] "Generic (PLEG): container finished" podID="6d529441-c1c2-4f95-bc63-ce522b9f1b07" containerID="4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599" exitCode=137
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.879772 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e30c-account-delete-pnqrp"
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.879798 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e30c-account-delete-pnqrp" event={"ID":"6d529441-c1c2-4f95-bc63-ce522b9f1b07","Type":"ContainerDied","Data":"4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599"}
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.879842 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e30c-account-delete-pnqrp" event={"ID":"6d529441-c1c2-4f95-bc63-ce522b9f1b07","Type":"ContainerDied","Data":"55face2d87f37ebc312073e998bdd0f880495858b9987731454264575e8ed7de"}
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.879863 5200 scope.go:117] "RemoveContainer" containerID="4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599"
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.914570 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/novacell0e30c-account-delete-pnqrp"]
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.915285 5200 scope.go:117] "RemoveContainer" containerID="4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599"
Nov 26 13:55:40 crc kubenswrapper[5200]: E1126 13:55:40.915905 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599\": container with ID starting with 4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599 not found: ID does not exist" containerID="4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599"
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.915957 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599"} err="failed to get container status \"4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599\": rpc error: code = NotFound desc = could not find container \"4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599\": container with ID starting with 4be6fc686b29d52ed3a76840540806240efb83da19e02dc33e3044b3acc10599 not found: ID does not exist"
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.920177 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0e30c-account-delete-pnqrp"]
Nov 26 13:55:40 crc kubenswrapper[5200]: I1126 13:55:40.980465 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d529441-c1c2-4f95-bc63-ce522b9f1b07" path="/var/lib/kubelet/pods/6d529441-c1c2-4f95-bc63-ce522b9f1b07/volumes"
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.578672 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi5c5d-account-delete-n94tm"
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.732923 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62496\" (UniqueName: \"kubernetes.io/projected/225d875d-9c0b-4acd-9305-b5cd82ea9056-kube-api-access-62496\") pod \"225d875d-9c0b-4acd-9305-b5cd82ea9056\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") "
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.733478 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts\") pod \"225d875d-9c0b-4acd-9305-b5cd82ea9056\" (UID: \"225d875d-9c0b-4acd-9305-b5cd82ea9056\") "
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.734693 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "225d875d-9c0b-4acd-9305-b5cd82ea9056" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.735392 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/225d875d-9c0b-4acd-9305-b5cd82ea9056-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.739132 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225d875d-9c0b-4acd-9305-b5cd82ea9056-kube-api-access-62496" (OuterVolumeSpecName: "kube-api-access-62496") pod "225d875d-9c0b-4acd-9305-b5cd82ea9056" (UID: "225d875d-9c0b-4acd-9305-b5cd82ea9056"). InnerVolumeSpecName "kube-api-access-62496". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.836506 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62496\" (UniqueName: \"kubernetes.io/projected/225d875d-9c0b-4acd-9305-b5cd82ea9056-kube-api-access-62496\") on node \"crc\" DevicePath \"\""
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.908817 5200 generic.go:358] "Generic (PLEG): container finished" podID="225d875d-9c0b-4acd-9305-b5cd82ea9056" containerID="7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718" exitCode=137
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.908909 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi5c5d-account-delete-n94tm"
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.908956 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5c5d-account-delete-n94tm" event={"ID":"225d875d-9c0b-4acd-9305-b5cd82ea9056","Type":"ContainerDied","Data":"7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718"}
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.909014 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi5c5d-account-delete-n94tm" event={"ID":"225d875d-9c0b-4acd-9305-b5cd82ea9056","Type":"ContainerDied","Data":"7c198d7cc3b7d48e26594850911ce3af2caccafa8ecf629c34e9c3402a122711"}
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.909039 5200 scope.go:117] "RemoveContainer" containerID="7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718"
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.963090 5200 scope.go:117] "RemoveContainer" containerID="7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718"
Nov 26 13:55:41 crc kubenswrapper[5200]: E1126 13:55:41.963597 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718\": container with ID starting with 7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718 not found: ID does not exist" containerID="7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718"
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.963645 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718"} err="failed to get container status \"7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718\": rpc error: code = NotFound desc = could not find container \"7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718\": container with ID starting with 7a9e13894c22d152ad816efa4b1947a0737ada3d3c1082cff4ecd8c19d064718 not found: ID does not exist"
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.967355 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/novaapi5c5d-account-delete-n94tm"]
Nov 26 13:55:41 crc kubenswrapper[5200]: I1126 13:55:41.981398 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi5c5d-account-delete-n94tm"]
Nov 26 13:55:42 crc kubenswrapper[5200]: E1126 13:55:42.056379 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1177056 actualBytes=10240
Nov 26 13:55:42 crc kubenswrapper[5200]: I1126 13:55:42.989235 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225d875d-9c0b-4acd-9305-b5cd82ea9056" path="/var/lib/kubelet/pods/225d875d-9c0b-4acd-9305-b5cd82ea9056/volumes"
Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.551960 5200 scope.go:117] "RemoveContainer" containerID="67c84e57c9be8a0f520564f094ee2c05a7164becc48de237903183acadf66bf9"
Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.593412 5200 scope.go:117] "RemoveContainer" containerID="f33740e7d3adadc613cd9ce218aeb6d42b611cbf434cfa017559867f048dc3ab"
Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.662895 5200 scope.go:117] "RemoveContainer" containerID="7109e9da47f7080d9ecd3d5f3703e4f8c86eeb23cecde84a94b7c73aba425122"
Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.688511 5200 scope.go:117] "RemoveContainer"
containerID="f6d760adf06e8495fc5adf371d93425f87ebb578c30594484a1a0242dbb0161b" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.728629 5200 scope.go:117] "RemoveContainer" containerID="0158049cf93b394d6d7796f9c468a05c7b8d8d0570e52f2d446a83652427fa8c" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.752416 5200 scope.go:117] "RemoveContainer" containerID="dcdcd6e1ce754876345a56277736b1613ed38c8f5c5d0d53a8279dd09e133bb0" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.806999 5200 scope.go:117] "RemoveContainer" containerID="7bb16d3a8f9b0cf1fc4e509932320bf940df760e4ccedec11d61d40970672efa" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.829658 5200 scope.go:117] "RemoveContainer" containerID="41da013d26125d68b1a8509a400bc2f1b9e594ecbaa3ee72827d094ece021809" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.852214 5200 scope.go:117] "RemoveContainer" containerID="7a8141c0c332f8af8b48e4fd35e84bf7b0c089845033f6d2d80786d4daaee498" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.872551 5200 scope.go:117] "RemoveContainer" containerID="7a392c174599e7a3f2dd9b8b60d26c24a0c73afb117b34e1bb48a13a05f37f8a" Nov 26 13:56:22 crc kubenswrapper[5200]: I1126 13:56:22.890800 5200 scope.go:117] "RemoveContainer" containerID="8eb763e1a24e243fecca1c957e0644829929333388f6e4d32ab9411ef66f8cf1" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.844530 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rm57z"] Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846120 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846146 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846167 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server-init" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846175 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server-init" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846185 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerName="rabbitmq" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846194 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerName="rabbitmq" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846206 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846213 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846224 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846231 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-replicator" Nov 26 
13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846246 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846254 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846265 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-updater" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846272 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-updater" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846284 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="proxy-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846291 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="proxy-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846302 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846310 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846324 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846331 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846344 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846352 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846364 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-central-agent" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846371 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-central-agent" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846383 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846390 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846418 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="225d875d-9c0b-4acd-9305-b5cd82ea9056" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 
13:56:24.846426 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="225d875d-9c0b-4acd-9305-b5cd82ea9056" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846434 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" containerName="nova-cell0-conductor-conductor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846459 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" containerName="nova-cell0-conductor-conductor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846470 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-expirer" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846478 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-expirer" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846491 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="rsync" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846498 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="rsync" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846506 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846513 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846525 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a62b3a-c69a-47fa-9642-425931d81827" containerName="galera" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846532 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a62b3a-c69a-47fa-9642-425931d81827" containerName="galera" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846539 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerName="nova-cell1-conductor-conductor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846546 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerName="nova-cell1-conductor-conductor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846557 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846564 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846572 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846578 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846590 5200 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846598 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846609 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846617 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846632 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846639 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846650 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db404a7-2c7c-4a0c-b21c-570bb473b1ff" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846657 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db404a7-2c7c-4a0c-b21c-570bb473b1ff" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846667 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846674 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846692 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caee6c82-a418-4a1d-a364-0a0eb9073c09" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846700 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="caee6c82-a418-4a1d-a364-0a0eb9073c09" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846711 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846735 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846748 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-updater" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846756 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-updater" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846767 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-notification-agent" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846774 5200 
state_mem.go:107] "Deleted CPUSet assignment" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-notification-agent" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846785 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="sg-core" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846792 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="sg-core" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846806 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846814 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846821 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" containerName="kube-state-metrics" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846829 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" containerName="kube-state-metrics" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846839 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846847 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846868 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846875 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846885 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerName="setup-container" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846892 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerName="setup-container" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846905 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-metadata" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846913 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-metadata" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846927 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a62b3a-c69a-47fa-9642-425931d81827" containerName="mysql-bootstrap" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846934 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a62b3a-c69a-47fa-9642-425931d81827" containerName="mysql-bootstrap" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846952 5200 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846960 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846970 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846977 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846986 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-reaper" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846992 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-reaper" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.846999 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847007 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847016 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05534089-21c2-4d8c-9b9e-d21c7082e727" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847022 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="05534089-21c2-4d8c-9b9e-d21c7082e727" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847033 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847042 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847049 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d529441-c1c2-4f95-bc63-ce522b9f1b07" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847056 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d529441-c1c2-4f95-bc63-ce522b9f1b07" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847075 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847082 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847096 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="openstack-network-exporter" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847103 5200 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="openstack-network-exporter" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847115 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab255459-89c5-462e-a89d-eb26ba377f79" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847123 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab255459-89c5-462e-a89d-eb26ba377f79" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847138 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847144 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847161 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55e57b83-77f4-446e-bc0a-c78f9094aacb" containerName="memcached" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847169 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e57b83-77f4-446e-bc0a-c78f9094aacb" containerName="memcached" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847207 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847214 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847223 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847230 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847242 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11e2f1eb-442a-4f5c-845c-1ecf10744fd1" containerName="keystone-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847265 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e2f1eb-442a-4f5c-845c-1ecf10744fd1" containerName="keystone-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847276 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="swift-recon-cron" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847284 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="swift-recon-cron" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847297 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerName="setup-container" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847305 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerName="setup-container" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847316 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847324 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847337 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847344 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847354 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adab4e53-c668-44cf-be5d-5802a77124aa" containerName="nova-scheduler-scheduler" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847363 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="adab4e53-c668-44cf-be5d-5802a77124aa" containerName="nova-scheduler-scheduler" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847381 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847388 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847395 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847402 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847412 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847420 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847431 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerName="rabbitmq" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847439 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerName="rabbitmq" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847452 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="ovn-northd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847459 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="ovn-northd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847620 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847637 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-log" Nov 26 13:56:24 crc 
kubenswrapper[5200]: I1126 13:56:24.847645 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847660 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847696 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847712 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8db404a7-2c7c-4a0c-b21c-570bb473b1ff" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847743 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fdc5a98-ccb9-4059-ba29-729ad8af9c4f" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847753 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="rsync" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847765 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847774 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="eab77ac1-a779-46ae-b0f6-7d39a99d1ecb" containerName="kube-state-metrics" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847786 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847796 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="caee6c82-a418-4a1d-a364-0a0eb9073c09" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847806 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847814 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-metadata" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847827 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847835 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="11e2f1eb-442a-4f5c-845c-1ecf10744fd1" containerName="keystone-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847845 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovs-vswitchd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847858 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="55e57b83-77f4-446e-bc0a-c78f9094aacb" containerName="memcached" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847867 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8484d8df-fa24-4e8e-85ba-c318fc6cde61" containerName="nova-metadata-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847889 5200 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-expirer" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847901 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="adab4e53-c668-44cf-be5d-5802a77124aa" containerName="nova-scheduler-scheduler" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847912 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847920 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847930 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="743192ea-7611-42dc-ab35-0dd584590479" containerName="ovn-controller" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847937 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="27a62b3a-c69a-47fa-9642-425931d81827" containerName="galera" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847948 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847957 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="789d54c5-fbf5-4277-b8f7-6f1c4a7a6f59" containerName="barbican-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847966 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-notification-agent" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847976 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847983 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bdf37de-cbcd-4971-a688-f3b91cf98b47" containerName="glance-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.847994 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8edf3f05-24ba-41fa-82bd-3a4e57a1cee6" containerName="nova-cell0-conductor-conductor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848006 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="aae8bd0a-01d7-497d-b7fb-26c495823af7" containerName="nova-cell1-conductor-conductor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848015 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-reaper" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848023 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-replicator" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848035 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f6817f6-ce07-4b0f-a6c8-d667faa003a8" containerName="placement-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848046 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-updater" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848053 5200 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="proxy-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848065 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c5390ed-d89e-4450-adf3-1363b1150d1b" containerName="rabbitmq" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848077 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="account-auditor" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848086 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="sg-core" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848093 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="object-updater" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848104 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="257070be-da5f-4601-b466-b9f7c8c8f153" containerName="ovsdb-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848114 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848122 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="225d875d-9c0b-4acd-9305-b5cd82ea9056" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848129 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="container-server" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848150 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2c7ebc3-a1ef-4244-8fc8-a46aca7a1ee0" containerName="nova-api-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848158 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab255459-89c5-462e-a89d-eb26ba377f79" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848167 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="edfe6fae-0a04-415d-8cd7-ddf1dddf7d31" containerName="ceilometer-central-agent" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848178 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848187 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f5ea8d4-ba2b-4abe-b645-0513268421f9" containerName="rabbitmq" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848195 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8c608b-5148-492a-a8a7-b8eccc586561" containerName="swift-recon-cron" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848206 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dd57521-dfcc-4303-9f37-3485c7407851" containerName="barbican-keystone-listener-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848214 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d529441-c1c2-4f95-bc63-ce522b9f1b07" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848222 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="openstack-network-exporter" Nov 26 
13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848413 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a85e23d-55dd-4d11-b304-aeb59804f47d" containerName="barbican-worker" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848423 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="12479106-7c82-418c-b017-03e5fcc4f018" containerName="ovn-northd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848431 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3f6e95d-1b68-4641-a4e5-6bd172c0662e" containerName="neutron-api" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848441 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="046ac75d-4cfe-40ed-aca5-a5875ea641d8" containerName="glance-httpd" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848450 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff2e56e0-4f2e-4101-b051-111cb2cbee56" containerName="cinder-api-log" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.848458 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="05534089-21c2-4d8c-9b9e-d21c7082e727" containerName="mariadb-account-delete" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.871538 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rm57z"] Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.871863 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.998865 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65j95\" (UniqueName: \"kubernetes.io/projected/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-kube-api-access-65j95\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.999026 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-utilities\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:24 crc kubenswrapper[5200]: I1126 13:56:24.999281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-catalog-content\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.101027 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-utilities\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.101120 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-catalog-content\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " 
pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.101169 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65j95\" (UniqueName: \"kubernetes.io/projected/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-kube-api-access-65j95\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.101613 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-utilities\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.101663 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-catalog-content\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.131995 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65j95\" (UniqueName: \"kubernetes.io/projected/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-kube-api-access-65j95\") pod \"community-operators-rm57z\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.191547 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.477469 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rm57z"] Nov 26 13:56:25 crc kubenswrapper[5200]: I1126 13:56:25.485849 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 13:56:26 crc kubenswrapper[5200]: I1126 13:56:26.422275 5200 generic.go:358] "Generic (PLEG): container finished" podID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerID="56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee" exitCode=0 Nov 26 13:56:26 crc kubenswrapper[5200]: I1126 13:56:26.422934 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm57z" event={"ID":"a51b64b2-a340-4f43-81fd-8ce68f9c69d7","Type":"ContainerDied","Data":"56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee"} Nov 26 13:56:26 crc kubenswrapper[5200]: I1126 13:56:26.423018 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm57z" event={"ID":"a51b64b2-a340-4f43-81fd-8ce68f9c69d7","Type":"ContainerStarted","Data":"9d13557b8cd372ab371599917283152f484993da2484ec94c20596a346f41162"} Nov 26 13:56:28 crc kubenswrapper[5200]: I1126 13:56:28.447573 5200 generic.go:358] "Generic (PLEG): container finished" podID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerID="89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673" exitCode=0 Nov 26 13:56:28 crc kubenswrapper[5200]: I1126 13:56:28.447667 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm57z" 
event={"ID":"a51b64b2-a340-4f43-81fd-8ce68f9c69d7","Type":"ContainerDied","Data":"89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673"} Nov 26 13:56:29 crc kubenswrapper[5200]: I1126 13:56:29.462259 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm57z" event={"ID":"a51b64b2-a340-4f43-81fd-8ce68f9c69d7","Type":"ContainerStarted","Data":"ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911"} Nov 26 13:56:29 crc kubenswrapper[5200]: I1126 13:56:29.485302 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rm57z" podStartSLOduration=4.55100804 podStartE2EDuration="5.485278063s" podCreationTimestamp="2025-11-26 13:56:24 +0000 UTC" firstStartedPulling="2025-11-26 13:56:26.423955836 +0000 UTC m=+1555.103157694" lastFinishedPulling="2025-11-26 13:56:27.358225889 +0000 UTC m=+1556.037427717" observedRunningTime="2025-11-26 13:56:29.480627342 +0000 UTC m=+1558.159829170" watchObservedRunningTime="2025-11-26 13:56:29.485278063 +0000 UTC m=+1558.164479891" Nov 26 13:56:33 crc kubenswrapper[5200]: I1126 13:56:33.154954 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:56:33 crc kubenswrapper[5200]: I1126 13:56:33.155420 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:56:35 crc kubenswrapper[5200]: I1126 13:56:35.192573 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:35 crc kubenswrapper[5200]: I1126 13:56:35.193054 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:35 crc kubenswrapper[5200]: I1126 13:56:35.257407 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:35 crc kubenswrapper[5200]: I1126 13:56:35.601068 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:35 crc kubenswrapper[5200]: I1126 13:56:35.670076 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rm57z"] Nov 26 13:56:37 crc kubenswrapper[5200]: I1126 13:56:37.546498 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rm57z" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="registry-server" containerID="cri-o://ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911" gracePeriod=2 Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.098351 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.151591 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65j95\" (UniqueName: \"kubernetes.io/projected/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-kube-api-access-65j95\") pod \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.151671 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-utilities\") pod \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.151723 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-catalog-content\") pod \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\" (UID: \"a51b64b2-a340-4f43-81fd-8ce68f9c69d7\") " Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.155295 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-utilities" (OuterVolumeSpecName: "utilities") pod "a51b64b2-a340-4f43-81fd-8ce68f9c69d7" (UID: "a51b64b2-a340-4f43-81fd-8ce68f9c69d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.162968 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-kube-api-access-65j95" (OuterVolumeSpecName: "kube-api-access-65j95") pod "a51b64b2-a340-4f43-81fd-8ce68f9c69d7" (UID: "a51b64b2-a340-4f43-81fd-8ce68f9c69d7"). InnerVolumeSpecName "kube-api-access-65j95". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.253584 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65j95\" (UniqueName: \"kubernetes.io/projected/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-kube-api-access-65j95\") on node \"crc\" DevicePath \"\"" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.254066 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.565033 5200 generic.go:358] "Generic (PLEG): container finished" podID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerID="ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911" exitCode=0 Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.565165 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm57z" event={"ID":"a51b64b2-a340-4f43-81fd-8ce68f9c69d7","Type":"ContainerDied","Data":"ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911"} Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.565214 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rm57z" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.565279 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm57z" event={"ID":"a51b64b2-a340-4f43-81fd-8ce68f9c69d7","Type":"ContainerDied","Data":"9d13557b8cd372ab371599917283152f484993da2484ec94c20596a346f41162"} Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.565318 5200 scope.go:117] "RemoveContainer" containerID="ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.602374 5200 scope.go:117] "RemoveContainer" containerID="89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.642069 5200 scope.go:117] "RemoveContainer" containerID="56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.679013 5200 scope.go:117] "RemoveContainer" containerID="ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911" Nov 26 13:56:38 crc kubenswrapper[5200]: E1126 13:56:38.679661 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911\": container with ID starting with ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911 not found: ID does not exist" containerID="ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.679768 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911"} err="failed to get container status \"ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911\": rpc error: code = NotFound desc = could not find container \"ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911\": container with ID starting with ced36c10849f8d5932d270734b2dafc25a76211fef16c50f4110c45310ac0911 not found: ID does not exist" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.679812 5200 scope.go:117] "RemoveContainer" containerID="89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673" Nov 26 13:56:38 crc kubenswrapper[5200]: E1126 13:56:38.680305 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673\": container with ID starting with 89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673 not found: ID does not exist" containerID="89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.680338 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673"} err="failed to get container status \"89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673\": rpc error: code = NotFound desc = could not find container \"89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673\": container with ID starting with 89c7f809107277ede0dc6ed3d1634ef6f6440bd29b790511730264e3df6ed673 not found: ID does not exist" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.680363 5200 scope.go:117] "RemoveContainer" 
containerID="56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.680660 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a51b64b2-a340-4f43-81fd-8ce68f9c69d7" (UID: "a51b64b2-a340-4f43-81fd-8ce68f9c69d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 13:56:38 crc kubenswrapper[5200]: E1126 13:56:38.681015 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee\": container with ID starting with 56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee not found: ID does not exist" containerID="56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.681087 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee"} err="failed to get container status \"56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee\": rpc error: code = NotFound desc = could not find container \"56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee\": container with ID starting with 56cbf3c9b8bc2d9504c2047291c84e5edc7fb3f4a6862abc51162e6a1b6c12ee not found: ID does not exist" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.761524 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a51b64b2-a340-4f43-81fd-8ce68f9c69d7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.935906 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rm57z"] Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.950980 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rm57z"] Nov 26 13:56:38 crc kubenswrapper[5200]: I1126 13:56:38.998408 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" path="/var/lib/kubelet/pods/a51b64b2-a340-4f43-81fd-8ce68f9c69d7/volumes" Nov 26 13:56:41 crc kubenswrapper[5200]: E1126 13:56:41.951968 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167886 actualBytes=10240 Nov 26 13:57:03 crc kubenswrapper[5200]: I1126 13:57:03.154343 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:57:03 crc kubenswrapper[5200]: I1126 13:57:03.155134 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.141984 5200 scope.go:117] "RemoveContainer" 
containerID="9bc8f43279d7dbd4a5978975de7b27caa4051fc1687a7d4d38d84f361032231d" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.199157 5200 scope.go:117] "RemoveContainer" containerID="3bb0c257014ca67f7e212e58afdefd5fd1976e3c93c5241454d4d104f8967837" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.238890 5200 scope.go:117] "RemoveContainer" containerID="0d7570f872547bd971357e3ece39102ca1a3fdf8d21f961de3c227b7adf7f9d2" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.265025 5200 scope.go:117] "RemoveContainer" containerID="1cb065ef5a3afa68f135814b127b312d334db2bcc6879e0f0a48d70b9a09a7d0" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.302152 5200 scope.go:117] "RemoveContainer" containerID="40d709137da6b2f53759103baa72cc3b0d8e67e3dc4e4e6014e7e28f64cad8aa" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.379521 5200 scope.go:117] "RemoveContainer" containerID="dd1df3fe3624bda2e4605d659f481bd396a2428753503441f5cac1b6400e3793" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.413252 5200 scope.go:117] "RemoveContainer" containerID="ab5913c8e3d9589e5f92e5576e882b6f21b5df61dc31eccb2fadecbd632b03e7" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.435319 5200 scope.go:117] "RemoveContainer" containerID="237e74b2a9070a9cdf1ca3c6faeff26ff070807b7b94c22a1a52300979212498" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.455871 5200 scope.go:117] "RemoveContainer" containerID="b1467a6c8e757d0cebe3388af58f90782af6cdaa6f141e954be99d27e7210802" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.479037 5200 scope.go:117] "RemoveContainer" containerID="76f5da8064003e6153826b45a00197cff1cb02daddb956c84a81e5f15f4da5d8" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.550549 5200 scope.go:117] "RemoveContainer" containerID="dd515b54d795cf2f7c82f49b17aaf60ca06232d30e43913d279f5c017012b23a" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.583934 5200 scope.go:117] "RemoveContainer" containerID="367602e761bf597b8df1b7576ea8c9760abfec581d9ad4e3e0f5abcc608fde62" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.626208 5200 scope.go:117] "RemoveContainer" containerID="6780bed773fef926c2ee46e41b04518f7f0797ec609aa5198d0c453994b1aebc" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.671231 5200 scope.go:117] "RemoveContainer" containerID="415457488fffda18bdfc53b0bac2a90c6c9fe3bafcf3620cbd35fa9928bde6da" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.707896 5200 scope.go:117] "RemoveContainer" containerID="d1b240a40098245ac1b2296169e9eed4b5def878e2367f4306844c7aea45ff74" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.737304 5200 scope.go:117] "RemoveContainer" containerID="1034a642fca91021fb4a1bf5b30b2051f4a68e54854ca3b2ef0040c85597f817" Nov 26 13:57:23 crc kubenswrapper[5200]: I1126 13:57:23.833885 5200 scope.go:117] "RemoveContainer" containerID="a47711b2713db506a1c2b4c89cdd12db3f54982d27690fb2933e78650de8d05e" Nov 26 13:57:33 crc kubenswrapper[5200]: I1126 13:57:33.154932 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 13:57:33 crc kubenswrapper[5200]: I1126 13:57:33.155451 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 13:57:33 crc kubenswrapper[5200]: I1126 13:57:33.155527 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 13:57:33 crc kubenswrapper[5200]: I1126 13:57:33.156827 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 13:57:33 crc kubenswrapper[5200]: I1126 13:57:33.156940 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" gracePeriod=600 Nov 26 13:57:33 crc kubenswrapper[5200]: E1126 13:57:33.301537 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:57:34 crc kubenswrapper[5200]: I1126 13:57:34.258844 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" exitCode=0 Nov 26 13:57:34 crc kubenswrapper[5200]: I1126 13:57:34.259008 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970"} Nov 26 13:57:34 crc kubenswrapper[5200]: I1126 13:57:34.259479 5200 scope.go:117] "RemoveContainer" containerID="8c9e2fd96d697c529382ab47187145fbd161a7e2364741d5b8b26afa12ecb0a5" Nov 26 13:57:34 crc kubenswrapper[5200]: I1126 13:57:34.260638 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:57:34 crc kubenswrapper[5200]: E1126 13:57:34.261210 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:57:42 crc kubenswrapper[5200]: E1126 13:57:42.157825 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167882 actualBytes=10240 Nov 26 13:57:49 crc kubenswrapper[5200]: I1126 13:57:49.969557 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:57:49 crc kubenswrapper[5200]: E1126 
13:57:49.970237 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:58:03 crc kubenswrapper[5200]: I1126 13:58:03.969764 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:58:03 crc kubenswrapper[5200]: E1126 13:58:03.970961 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:58:14 crc kubenswrapper[5200]: I1126 13:58:14.969951 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:58:14 crc kubenswrapper[5200]: E1126 13:58:14.970745 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:58:24 crc kubenswrapper[5200]: I1126 13:58:24.353366 5200 scope.go:117] "RemoveContainer" containerID="7503a8964c681bf49d84540f8809922bb7d2d71d2f4e4f1b1afea676f29d0da7" Nov 26 13:58:24 crc kubenswrapper[5200]: I1126 13:58:24.391201 5200 scope.go:117] "RemoveContainer" containerID="5bb653ca071dc9a61e5ecf99b1fe3a17c36b5dd2807279eee649a3530583f056" Nov 26 13:58:24 crc kubenswrapper[5200]: I1126 13:58:24.425334 5200 scope.go:117] "RemoveContainer" containerID="cda7517c7cfab2ef565f5c3fa5201af6cdade851c6625c180d2015174e06c39f" Nov 26 13:58:28 crc kubenswrapper[5200]: I1126 13:58:28.969069 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:58:28 crc kubenswrapper[5200]: E1126 13:58:28.969856 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:58:40 crc kubenswrapper[5200]: I1126 13:58:40.969885 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:58:40 crc kubenswrapper[5200]: E1126 13:58:40.971225 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:58:42 crc kubenswrapper[5200]: E1126 13:58:42.502369 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167876 actualBytes=10240 Nov 26 13:58:53 crc kubenswrapper[5200]: I1126 13:58:53.969911 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:58:53 crc kubenswrapper[5200]: E1126 13:58:53.970857 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:59:06 crc kubenswrapper[5200]: I1126 13:59:06.970559 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:59:06 crc kubenswrapper[5200]: E1126 13:59:06.972307 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:59:20 crc kubenswrapper[5200]: I1126 13:59:20.970362 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:59:20 crc kubenswrapper[5200]: E1126 13:59:20.972253 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.513413 5200 scope.go:117] "RemoveContainer" containerID="68b0094d24ce9ed64996fa8a71a98c96dd392c95b50b7a7799481cf8fdd85c51" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.570294 5200 scope.go:117] "RemoveContainer" containerID="b9bea9b3d60d49f0f23ce4d1f8f0a5cdbb34657ba354b353d520cedd7760db3f" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.625206 5200 scope.go:117] "RemoveContainer" containerID="c9318e677b85daa3ac00ae27709df9fa64f05c09a028ed186bea46eb31efe3bf" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.656427 5200 scope.go:117] "RemoveContainer" containerID="fdb72c20a0584eddffd66dbae804db6f68e5c2f1e04b39b071c742e0af14da1f" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.687483 5200 scope.go:117] "RemoveContainer" containerID="2460558ab936aba0248046a3b63a15a776f86be31eb3e11aa26efff50eeff85c" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.723426 5200 scope.go:117] "RemoveContainer" containerID="f2ee82a44b7d1de45d8bea7c8864c1381ebc8a9c06ecf8f3807c22a0dda52190" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 
13:59:24.749987 5200 scope.go:117] "RemoveContainer" containerID="45b0a0d376caba5a2413c25066270cd6614242985b7d52550ad69cc6170e8cd6" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.766287 5200 scope.go:117] "RemoveContainer" containerID="4aa8b6c3334d256efe21f55e401990fa88f4f0e637a7442e68b7624c21e7f69a" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.789511 5200 scope.go:117] "RemoveContainer" containerID="1afaf085c089cd36284f090e60b9e0ede6edf0fcbb9d8f0be71bde2698693e43" Nov 26 13:59:24 crc kubenswrapper[5200]: I1126 13:59:24.831997 5200 scope.go:117] "RemoveContainer" containerID="03b29c837a967963b6848307dc062dac95cb5e7b79937c757a26948129ebd5ec" Nov 26 13:59:31 crc kubenswrapper[5200]: I1126 13:59:31.969227 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:59:31 crc kubenswrapper[5200]: E1126 13:59:31.970184 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:59:41 crc kubenswrapper[5200]: E1126 13:59:41.991490 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167874 actualBytes=10240 Nov 26 13:59:42 crc kubenswrapper[5200]: I1126 13:59:42.988343 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:59:42 crc kubenswrapper[5200]: E1126 13:59:42.989174 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 13:59:53 crc kubenswrapper[5200]: I1126 13:59:53.970153 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 13:59:53 crc kubenswrapper[5200]: E1126 13:59:53.971419 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.174888 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d"] Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177006 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="registry-server" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177030 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="registry-server" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177076 5200 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="extract-content" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177086 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="extract-content" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177097 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="extract-utilities" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177105 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="extract-utilities" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.177327 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a51b64b2-a340-4f43-81fd-8ce68f9c69d7" containerName="registry-server" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.186681 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d"] Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.186974 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.189726 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.190244 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.317159 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236b4a36-aa29-49ef-a0f2-eb61772cb764-config-volume\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.317327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/236b4a36-aa29-49ef-a0f2-eb61772cb764-secret-volume\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.317392 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2klkw\" (UniqueName: \"kubernetes.io/projected/236b4a36-aa29-49ef-a0f2-eb61772cb764-kube-api-access-2klkw\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.419925 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2klkw\" (UniqueName: \"kubernetes.io/projected/236b4a36-aa29-49ef-a0f2-eb61772cb764-kube-api-access-2klkw\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.420046 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236b4a36-aa29-49ef-a0f2-eb61772cb764-config-volume\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.420099 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/236b4a36-aa29-49ef-a0f2-eb61772cb764-secret-volume\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.421961 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236b4a36-aa29-49ef-a0f2-eb61772cb764-config-volume\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.428814 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/236b4a36-aa29-49ef-a0f2-eb61772cb764-secret-volume\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.442565 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2klkw\" (UniqueName: \"kubernetes.io/projected/236b4a36-aa29-49ef-a0f2-eb61772cb764-kube-api-access-2klkw\") pod \"collect-profiles-29402760-rxf2d\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:00 crc kubenswrapper[5200]: I1126 14:00:00.528876 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:01 crc kubenswrapper[5200]: I1126 14:00:01.062756 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d"] Nov 26 14:00:01 crc kubenswrapper[5200]: I1126 14:00:01.933594 5200 generic.go:358] "Generic (PLEG): container finished" podID="236b4a36-aa29-49ef-a0f2-eb61772cb764" containerID="c38c3385081ffc236fe3f1b574995afe5179a1ed2a52dff40b8e59e67daeae26" exitCode=0 Nov 26 14:00:01 crc kubenswrapper[5200]: I1126 14:00:01.933760 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" event={"ID":"236b4a36-aa29-49ef-a0f2-eb61772cb764","Type":"ContainerDied","Data":"c38c3385081ffc236fe3f1b574995afe5179a1ed2a52dff40b8e59e67daeae26"} Nov 26 14:00:01 crc kubenswrapper[5200]: I1126 14:00:01.933863 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" event={"ID":"236b4a36-aa29-49ef-a0f2-eb61772cb764","Type":"ContainerStarted","Data":"0feaa8efbb2f19058a9b8e9ceca2ed7c64411be8df5248f6c98db0737fe6f6e3"} Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.301577 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.405261 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236b4a36-aa29-49ef-a0f2-eb61772cb764-config-volume\") pod \"236b4a36-aa29-49ef-a0f2-eb61772cb764\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.405596 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2klkw\" (UniqueName: \"kubernetes.io/projected/236b4a36-aa29-49ef-a0f2-eb61772cb764-kube-api-access-2klkw\") pod \"236b4a36-aa29-49ef-a0f2-eb61772cb764\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.406077 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/236b4a36-aa29-49ef-a0f2-eb61772cb764-secret-volume\") pod \"236b4a36-aa29-49ef-a0f2-eb61772cb764\" (UID: \"236b4a36-aa29-49ef-a0f2-eb61772cb764\") " Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.406535 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236b4a36-aa29-49ef-a0f2-eb61772cb764-config-volume" (OuterVolumeSpecName: "config-volume") pod "236b4a36-aa29-49ef-a0f2-eb61772cb764" (UID: "236b4a36-aa29-49ef-a0f2-eb61772cb764"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.408041 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/236b4a36-aa29-49ef-a0f2-eb61772cb764-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.414437 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236b4a36-aa29-49ef-a0f2-eb61772cb764-kube-api-access-2klkw" (OuterVolumeSpecName: "kube-api-access-2klkw") pod "236b4a36-aa29-49ef-a0f2-eb61772cb764" (UID: "236b4a36-aa29-49ef-a0f2-eb61772cb764"). InnerVolumeSpecName "kube-api-access-2klkw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.414663 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236b4a36-aa29-49ef-a0f2-eb61772cb764-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "236b4a36-aa29-49ef-a0f2-eb61772cb764" (UID: "236b4a36-aa29-49ef-a0f2-eb61772cb764"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.509088 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2klkw\" (UniqueName: \"kubernetes.io/projected/236b4a36-aa29-49ef-a0f2-eb61772cb764-kube-api-access-2klkw\") on node \"crc\" DevicePath \"\"" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.509123 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/236b4a36-aa29-49ef-a0f2-eb61772cb764-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.959774 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" event={"ID":"236b4a36-aa29-49ef-a0f2-eb61772cb764","Type":"ContainerDied","Data":"0feaa8efbb2f19058a9b8e9ceca2ed7c64411be8df5248f6c98db0737fe6f6e3"} Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.959848 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0feaa8efbb2f19058a9b8e9ceca2ed7c64411be8df5248f6c98db0737fe6f6e3" Nov 26 14:00:03 crc kubenswrapper[5200]: I1126 14:00:03.959793 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d" Nov 26 14:00:04 crc kubenswrapper[5200]: E1126 14:00:04.145392 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod236b4a36_aa29_49ef_a0f2_eb61772cb764.slice/crio-0feaa8efbb2f19058a9b8e9ceca2ed7c64411be8df5248f6c98db0737fe6f6e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod236b4a36_aa29_49ef_a0f2_eb61772cb764.slice\": RecentStats: unable to find data in memory cache]" Nov 26 14:00:06 crc kubenswrapper[5200]: I1126 14:00:06.970884 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:00:06 crc kubenswrapper[5200]: E1126 14:00:06.973053 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:00:21 crc kubenswrapper[5200]: I1126 14:00:21.970639 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:00:21 crc kubenswrapper[5200]: E1126 14:00:21.971810 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:00:25 crc kubenswrapper[5200]: I1126 14:00:25.039984 5200 scope.go:117] "RemoveContainer" containerID="913da7fa54a8dadf2cb1d0f96e7c1a4c10b128d92a72d6916595429a588b3727" Nov 26 14:00:25 crc kubenswrapper[5200]: I1126 14:00:25.070848 5200 scope.go:117] "RemoveContainer" containerID="8af815e72ea77b3056855c0473376215523b1abe17dce32a1fe4a0577d3966db" Nov 26 14:00:25 crc kubenswrapper[5200]: I1126 14:00:25.121265 5200 scope.go:117] "RemoveContainer" containerID="78be597c70fe60222ad7f16fc5b17c8bb5fb9f375d97a8edf8fce01c8f2a8af4" Nov 26 14:00:25 crc kubenswrapper[5200]: I1126 14:00:25.147598 5200 scope.go:117] "RemoveContainer" containerID="dae2b75b63a59901de07ca31786301f42b38647175de1bf352b0350a4ab170d9" Nov 26 14:00:35 crc kubenswrapper[5200]: I1126 14:00:35.382672 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:00:35 crc kubenswrapper[5200]: I1126 14:00:35.387575 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:00:35 crc kubenswrapper[5200]: I1126 14:00:35.388899 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:00:35 crc kubenswrapper[5200]: I1126 14:00:35.393457 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:00:36 crc kubenswrapper[5200]: I1126 14:00:36.969577 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:00:36 crc kubenswrapper[5200]: E1126 14:00:36.970532 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:00:42 crc kubenswrapper[5200]: E1126 14:00:42.184104 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167875 actualBytes=10240 Nov 26 14:00:50 crc kubenswrapper[5200]: I1126 14:00:50.969651 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:00:50 crc kubenswrapper[5200]: E1126 14:00:50.970922 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.712560 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mrt9x"] Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.714917 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="236b4a36-aa29-49ef-a0f2-eb61772cb764" containerName="collect-profiles" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.714943 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="236b4a36-aa29-49ef-a0f2-eb61772cb764" containerName="collect-profiles" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.715315 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="236b4a36-aa29-49ef-a0f2-eb61772cb764" containerName="collect-profiles" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.731322 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mrt9x"] Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.731501 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.809713 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz5t7\" (UniqueName: \"kubernetes.io/projected/cd54d656-f23e-46b8-b432-55850457774f-kube-api-access-pz5t7\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.809793 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-catalog-content\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.809897 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-utilities\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.911432 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz5t7\" (UniqueName: \"kubernetes.io/projected/cd54d656-f23e-46b8-b432-55850457774f-kube-api-access-pz5t7\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.911832 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-catalog-content\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.911862 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-utilities\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.912516 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-utilities\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.912635 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-catalog-content\") pod \"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:01 crc kubenswrapper[5200]: I1126 14:01:01.941117 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz5t7\" (UniqueName: \"kubernetes.io/projected/cd54d656-f23e-46b8-b432-55850457774f-kube-api-access-pz5t7\") pod 
\"certified-operators-mrt9x\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:02 crc kubenswrapper[5200]: I1126 14:01:02.062410 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:02 crc kubenswrapper[5200]: I1126 14:01:02.488163 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mrt9x"] Nov 26 14:01:02 crc kubenswrapper[5200]: I1126 14:01:02.602459 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerStarted","Data":"ef42402bf3f221967a1210fc75b78d8d0b53070cf61622cdff7888d054e316e7"} Nov 26 14:01:03 crc kubenswrapper[5200]: I1126 14:01:03.612995 5200 generic.go:358] "Generic (PLEG): container finished" podID="cd54d656-f23e-46b8-b432-55850457774f" containerID="67dd197a40bbb5f78125b20116d34f70e1ffd9b7ce94b054bc6ec455a2625356" exitCode=0 Nov 26 14:01:03 crc kubenswrapper[5200]: I1126 14:01:03.613059 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerDied","Data":"67dd197a40bbb5f78125b20116d34f70e1ffd9b7ce94b054bc6ec455a2625356"} Nov 26 14:01:03 crc kubenswrapper[5200]: I1126 14:01:03.969983 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:01:03 crc kubenswrapper[5200]: E1126 14:01:03.970588 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.621849 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerStarted","Data":"8154869cf21a113e1d539a7c8e6a3cfb279bc214e4211a731f0bc7af42ca5aa3"} Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.697668 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcs85"] Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.707014 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcs85"] Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.707208 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.862205 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8zgz\" (UniqueName: \"kubernetes.io/projected/9e06b57b-579c-4287-8269-3c9832b72835-kube-api-access-v8zgz\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.862309 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-catalog-content\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.862334 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-utilities\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.881449 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4v27"] Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.888144 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.899020 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4v27"] Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.964020 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8zgz\" (UniqueName: \"kubernetes.io/projected/9e06b57b-579c-4287-8269-3c9832b72835-kube-api-access-v8zgz\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.964141 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-catalog-content\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.964164 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-utilities\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.964601 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-utilities\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.964753 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-catalog-content\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:04 crc kubenswrapper[5200]: I1126 14:01:04.991969 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8zgz\" (UniqueName: \"kubernetes.io/projected/9e06b57b-579c-4287-8269-3c9832b72835-kube-api-access-v8zgz\") pod \"redhat-marketplace-wcs85\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.028488 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.066842 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-utilities\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.066926 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrrh\" (UniqueName: \"kubernetes.io/projected/ff06912b-7c32-43fe-bd03-9d18798342b9-kube-api-access-gmrrh\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.067120 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-catalog-content\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.174992 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrrh\" (UniqueName: \"kubernetes.io/projected/ff06912b-7c32-43fe-bd03-9d18798342b9-kube-api-access-gmrrh\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.175810 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-catalog-content\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.176313 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-catalog-content\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.176585 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-utilities\") pod \"redhat-operators-h4v27\" (UID: 
\"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.176914 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-utilities\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.194235 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrrh\" (UniqueName: \"kubernetes.io/projected/ff06912b-7c32-43fe-bd03-9d18798342b9-kube-api-access-gmrrh\") pod \"redhat-operators-h4v27\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.253793 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.469588 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcs85"] Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.631969 5200 generic.go:358] "Generic (PLEG): container finished" podID="cd54d656-f23e-46b8-b432-55850457774f" containerID="8154869cf21a113e1d539a7c8e6a3cfb279bc214e4211a731f0bc7af42ca5aa3" exitCode=0 Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.632026 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerDied","Data":"8154869cf21a113e1d539a7c8e6a3cfb279bc214e4211a731f0bc7af42ca5aa3"} Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.634477 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcs85" event={"ID":"9e06b57b-579c-4287-8269-3c9832b72835","Type":"ContainerStarted","Data":"2938c835cfa9101cc00fb1d353d91ae91fcb4146ceb5e90cde2606aecd3743fd"} Nov 26 14:01:05 crc kubenswrapper[5200]: I1126 14:01:05.706256 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4v27"] Nov 26 14:01:05 crc kubenswrapper[5200]: W1126 14:01:05.741008 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff06912b_7c32_43fe_bd03_9d18798342b9.slice/crio-14155159ab51886b8da62c22e4a1b0a4c4c18cd5bc6971b0a3ef9bc3490322f3 WatchSource:0}: Error finding container 14155159ab51886b8da62c22e4a1b0a4c4c18cd5bc6971b0a3ef9bc3490322f3: Status 404 returned error can't find the container with id 14155159ab51886b8da62c22e4a1b0a4c4c18cd5bc6971b0a3ef9bc3490322f3 Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.644612 5200 generic.go:358] "Generic (PLEG): container finished" podID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerID="445e2e4742a90572b0b851667b5b205fb622936fa4f923a30f6dca2d0a3584b0" exitCode=0 Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.644733 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4v27" event={"ID":"ff06912b-7c32-43fe-bd03-9d18798342b9","Type":"ContainerDied","Data":"445e2e4742a90572b0b851667b5b205fb622936fa4f923a30f6dca2d0a3584b0"} Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.645448 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-h4v27" event={"ID":"ff06912b-7c32-43fe-bd03-9d18798342b9","Type":"ContainerStarted","Data":"14155159ab51886b8da62c22e4a1b0a4c4c18cd5bc6971b0a3ef9bc3490322f3"} Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.652078 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerStarted","Data":"04468d4e2c0ba05ab979afbac14a679427ea077ce7af8d2cdaca2cfaff315d2d"} Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.654944 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e06b57b-579c-4287-8269-3c9832b72835" containerID="3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8" exitCode=0 Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.655378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcs85" event={"ID":"9e06b57b-579c-4287-8269-3c9832b72835","Type":"ContainerDied","Data":"3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8"} Nov 26 14:01:06 crc kubenswrapper[5200]: I1126 14:01:06.740443 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mrt9x" podStartSLOduration=5.035501666 podStartE2EDuration="5.740409054s" podCreationTimestamp="2025-11-26 14:01:01 +0000 UTC" firstStartedPulling="2025-11-26 14:01:03.614189235 +0000 UTC m=+1832.293391073" lastFinishedPulling="2025-11-26 14:01:04.319096633 +0000 UTC m=+1832.998298461" observedRunningTime="2025-11-26 14:01:06.732237107 +0000 UTC m=+1835.411438965" watchObservedRunningTime="2025-11-26 14:01:06.740409054 +0000 UTC m=+1835.419610912" Nov 26 14:01:08 crc kubenswrapper[5200]: I1126 14:01:08.684412 5200 generic.go:358] "Generic (PLEG): container finished" podID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerID="be7461b6272725aa53389ea23e45e5b4225e42d84115155f98b7732303254f33" exitCode=0 Nov 26 14:01:08 crc kubenswrapper[5200]: I1126 14:01:08.684585 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4v27" event={"ID":"ff06912b-7c32-43fe-bd03-9d18798342b9","Type":"ContainerDied","Data":"be7461b6272725aa53389ea23e45e5b4225e42d84115155f98b7732303254f33"} Nov 26 14:01:08 crc kubenswrapper[5200]: I1126 14:01:08.689855 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e06b57b-579c-4287-8269-3c9832b72835" containerID="09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8" exitCode=0 Nov 26 14:01:08 crc kubenswrapper[5200]: I1126 14:01:08.689981 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcs85" event={"ID":"9e06b57b-579c-4287-8269-3c9832b72835","Type":"ContainerDied","Data":"09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8"} Nov 26 14:01:09 crc kubenswrapper[5200]: I1126 14:01:09.705351 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4v27" event={"ID":"ff06912b-7c32-43fe-bd03-9d18798342b9","Type":"ContainerStarted","Data":"fa135893bce229f089dd5d5789614dcb7515311f0afbddc0559c16ab4e47fe20"} Nov 26 14:01:09 crc kubenswrapper[5200]: I1126 14:01:09.708730 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcs85" event={"ID":"9e06b57b-579c-4287-8269-3c9832b72835","Type":"ContainerStarted","Data":"699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18"} Nov 26 14:01:09 crc 
kubenswrapper[5200]: I1126 14:01:09.727239 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4v27" podStartSLOduration=4.695022033 podStartE2EDuration="5.727213645s" podCreationTimestamp="2025-11-26 14:01:04 +0000 UTC" firstStartedPulling="2025-11-26 14:01:06.646283621 +0000 UTC m=+1835.325485449" lastFinishedPulling="2025-11-26 14:01:07.678475243 +0000 UTC m=+1836.357677061" observedRunningTime="2025-11-26 14:01:09.724187274 +0000 UTC m=+1838.403389142" watchObservedRunningTime="2025-11-26 14:01:09.727213645 +0000 UTC m=+1838.406415453" Nov 26 14:01:09 crc kubenswrapper[5200]: I1126 14:01:09.751994 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcs85" podStartSLOduration=4.758293515 podStartE2EDuration="5.751970163s" podCreationTimestamp="2025-11-26 14:01:04 +0000 UTC" firstStartedPulling="2025-11-26 14:01:06.656127133 +0000 UTC m=+1835.335328961" lastFinishedPulling="2025-11-26 14:01:07.649803791 +0000 UTC m=+1836.329005609" observedRunningTime="2025-11-26 14:01:09.746612881 +0000 UTC m=+1838.425814739" watchObservedRunningTime="2025-11-26 14:01:09.751970163 +0000 UTC m=+1838.431171981" Nov 26 14:01:12 crc kubenswrapper[5200]: I1126 14:01:12.062926 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:12 crc kubenswrapper[5200]: I1126 14:01:12.062994 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:12 crc kubenswrapper[5200]: I1126 14:01:12.137018 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:12 crc kubenswrapper[5200]: I1126 14:01:12.796963 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:14 crc kubenswrapper[5200]: I1126 14:01:14.770317 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mrt9x"] Nov 26 14:01:14 crc kubenswrapper[5200]: I1126 14:01:14.770701 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mrt9x" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="registry-server" containerID="cri-o://04468d4e2c0ba05ab979afbac14a679427ea077ce7af8d2cdaca2cfaff315d2d" gracePeriod=2 Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.030598 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.030796 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.105412 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.255794 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.255866 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:15 crc 
kubenswrapper[5200]: I1126 14:01:15.318754 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.782454 5200 generic.go:358] "Generic (PLEG): container finished" podID="cd54d656-f23e-46b8-b432-55850457774f" containerID="04468d4e2c0ba05ab979afbac14a679427ea077ce7af8d2cdaca2cfaff315d2d" exitCode=0 Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.782556 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerDied","Data":"04468d4e2c0ba05ab979afbac14a679427ea077ce7af8d2cdaca2cfaff315d2d"} Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.783801 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrt9x" event={"ID":"cd54d656-f23e-46b8-b432-55850457774f","Type":"ContainerDied","Data":"ef42402bf3f221967a1210fc75b78d8d0b53070cf61622cdff7888d054e316e7"} Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.783822 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef42402bf3f221967a1210fc75b78d8d0b53070cf61622cdff7888d054e316e7" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.817418 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.830470 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.835046 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.911130 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz5t7\" (UniqueName: \"kubernetes.io/projected/cd54d656-f23e-46b8-b432-55850457774f-kube-api-access-pz5t7\") pod \"cd54d656-f23e-46b8-b432-55850457774f\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.911296 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-catalog-content\") pod \"cd54d656-f23e-46b8-b432-55850457774f\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.911507 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-utilities\") pod \"cd54d656-f23e-46b8-b432-55850457774f\" (UID: \"cd54d656-f23e-46b8-b432-55850457774f\") " Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.912893 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-utilities" (OuterVolumeSpecName: "utilities") pod "cd54d656-f23e-46b8-b432-55850457774f" (UID: "cd54d656-f23e-46b8-b432-55850457774f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.923912 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd54d656-f23e-46b8-b432-55850457774f-kube-api-access-pz5t7" (OuterVolumeSpecName: "kube-api-access-pz5t7") pod "cd54d656-f23e-46b8-b432-55850457774f" (UID: "cd54d656-f23e-46b8-b432-55850457774f"). InnerVolumeSpecName "kube-api-access-pz5t7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.948006 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd54d656-f23e-46b8-b432-55850457774f" (UID: "cd54d656-f23e-46b8-b432-55850457774f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:01:15 crc kubenswrapper[5200]: I1126 14:01:15.969720 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:01:15 crc kubenswrapper[5200]: E1126 14:01:15.970365 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.013838 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.013898 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pz5t7\" (UniqueName: \"kubernetes.io/projected/cd54d656-f23e-46b8-b432-55850457774f-kube-api-access-pz5t7\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.013910 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd54d656-f23e-46b8-b432-55850457774f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.794478 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mrt9x" Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.849736 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mrt9x"] Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.853523 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mrt9x"] Nov 26 14:01:16 crc kubenswrapper[5200]: I1126 14:01:16.985857 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd54d656-f23e-46b8-b432-55850457774f" path="/var/lib/kubelet/pods/cd54d656-f23e-46b8-b432-55850457774f/volumes" Nov 26 14:01:17 crc kubenswrapper[5200]: I1126 14:01:17.486184 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcs85"] Nov 26 14:01:18 crc kubenswrapper[5200]: I1126 14:01:18.490657 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4v27"] Nov 26 14:01:18 crc kubenswrapper[5200]: I1126 14:01:18.492082 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4v27" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="registry-server" containerID="cri-o://fa135893bce229f089dd5d5789614dcb7515311f0afbddc0559c16ab4e47fe20" gracePeriod=2 Nov 26 14:01:18 crc kubenswrapper[5200]: I1126 14:01:18.816705 5200 generic.go:358] "Generic (PLEG): container finished" podID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerID="fa135893bce229f089dd5d5789614dcb7515311f0afbddc0559c16ab4e47fe20" exitCode=0 Nov 26 14:01:18 crc kubenswrapper[5200]: I1126 14:01:18.816784 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4v27" event={"ID":"ff06912b-7c32-43fe-bd03-9d18798342b9","Type":"ContainerDied","Data":"fa135893bce229f089dd5d5789614dcb7515311f0afbddc0559c16ab4e47fe20"} Nov 26 14:01:18 crc kubenswrapper[5200]: I1126 14:01:18.817326 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wcs85" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="registry-server" containerID="cri-o://699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18" gracePeriod=2 Nov 26 14:01:18 crc kubenswrapper[5200]: I1126 14:01:18.953212 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.085453 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-utilities\") pod \"ff06912b-7c32-43fe-bd03-9d18798342b9\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.088957 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-catalog-content\") pod \"ff06912b-7c32-43fe-bd03-9d18798342b9\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.090164 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-utilities" (OuterVolumeSpecName: "utilities") pod "ff06912b-7c32-43fe-bd03-9d18798342b9" (UID: "ff06912b-7c32-43fe-bd03-9d18798342b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.094245 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmrrh\" (UniqueName: \"kubernetes.io/projected/ff06912b-7c32-43fe-bd03-9d18798342b9-kube-api-access-gmrrh\") pod \"ff06912b-7c32-43fe-bd03-9d18798342b9\" (UID: \"ff06912b-7c32-43fe-bd03-9d18798342b9\") " Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.095289 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.107834 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff06912b-7c32-43fe-bd03-9d18798342b9-kube-api-access-gmrrh" (OuterVolumeSpecName: "kube-api-access-gmrrh") pod "ff06912b-7c32-43fe-bd03-9d18798342b9" (UID: "ff06912b-7c32-43fe-bd03-9d18798342b9"). InnerVolumeSpecName "kube-api-access-gmrrh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.196796 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmrrh\" (UniqueName: \"kubernetes.io/projected/ff06912b-7c32-43fe-bd03-9d18798342b9-kube-api-access-gmrrh\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.199980 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff06912b-7c32-43fe-bd03-9d18798342b9" (UID: "ff06912b-7c32-43fe-bd03-9d18798342b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.230032 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.299037 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8zgz\" (UniqueName: \"kubernetes.io/projected/9e06b57b-579c-4287-8269-3c9832b72835-kube-api-access-v8zgz\") pod \"9e06b57b-579c-4287-8269-3c9832b72835\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.299409 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-catalog-content\") pod \"9e06b57b-579c-4287-8269-3c9832b72835\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.299452 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-utilities\") pod \"9e06b57b-579c-4287-8269-3c9832b72835\" (UID: \"9e06b57b-579c-4287-8269-3c9832b72835\") " Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.301227 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-utilities" (OuterVolumeSpecName: "utilities") pod "9e06b57b-579c-4287-8269-3c9832b72835" (UID: "9e06b57b-579c-4287-8269-3c9832b72835"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.301654 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff06912b-7c32-43fe-bd03-9d18798342b9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.301724 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.306086 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e06b57b-579c-4287-8269-3c9832b72835-kube-api-access-v8zgz" (OuterVolumeSpecName: "kube-api-access-v8zgz") pod "9e06b57b-579c-4287-8269-3c9832b72835" (UID: "9e06b57b-579c-4287-8269-3c9832b72835"). InnerVolumeSpecName "kube-api-access-v8zgz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.311331 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e06b57b-579c-4287-8269-3c9832b72835" (UID: "9e06b57b-579c-4287-8269-3c9832b72835"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.404524 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8zgz\" (UniqueName: \"kubernetes.io/projected/9e06b57b-579c-4287-8269-3c9832b72835-kube-api-access-v8zgz\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.404677 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e06b57b-579c-4287-8269-3c9832b72835-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.833084 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4v27" event={"ID":"ff06912b-7c32-43fe-bd03-9d18798342b9","Type":"ContainerDied","Data":"14155159ab51886b8da62c22e4a1b0a4c4c18cd5bc6971b0a3ef9bc3490322f3"} Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.833138 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4v27" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.833211 5200 scope.go:117] "RemoveContainer" containerID="fa135893bce229f089dd5d5789614dcb7515311f0afbddc0559c16ab4e47fe20" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.837654 5200 generic.go:358] "Generic (PLEG): container finished" podID="9e06b57b-579c-4287-8269-3c9832b72835" containerID="699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18" exitCode=0 Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.838009 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcs85" event={"ID":"9e06b57b-579c-4287-8269-3c9832b72835","Type":"ContainerDied","Data":"699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18"} Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.838053 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcs85" event={"ID":"9e06b57b-579c-4287-8269-3c9832b72835","Type":"ContainerDied","Data":"2938c835cfa9101cc00fb1d353d91ae91fcb4146ceb5e90cde2606aecd3743fd"} Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.838219 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcs85" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.868245 5200 scope.go:117] "RemoveContainer" containerID="be7461b6272725aa53389ea23e45e5b4225e42d84115155f98b7732303254f33" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.922297 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcs85"] Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.936179 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcs85"] Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.945413 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4v27"] Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.945661 5200 scope.go:117] "RemoveContainer" containerID="445e2e4742a90572b0b851667b5b205fb622936fa4f923a30f6dca2d0a3584b0" Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.952406 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4v27"] Nov 26 14:01:19 crc kubenswrapper[5200]: I1126 14:01:19.973980 5200 scope.go:117] "RemoveContainer" containerID="699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.018369 5200 scope.go:117] "RemoveContainer" containerID="09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.044745 5200 scope.go:117] "RemoveContainer" containerID="3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.090878 5200 scope.go:117] "RemoveContainer" containerID="699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18" Nov 26 14:01:20 crc kubenswrapper[5200]: E1126 14:01:20.091492 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18\": container with ID starting with 699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18 not found: ID does not exist" containerID="699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.091544 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18"} err="failed to get container status \"699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18\": rpc error: code = NotFound desc = could not find container \"699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18\": container with ID starting with 699db268ce5d743ccf8bb0f8ce5cf179611da754fa7348ce52650d071ceace18 not found: ID does not exist" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.091583 5200 scope.go:117] "RemoveContainer" containerID="09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8" Nov 26 14:01:20 crc kubenswrapper[5200]: E1126 14:01:20.092217 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8\": container with ID starting with 09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8 not found: ID does not exist" containerID="09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8" Nov 26 14:01:20 
crc kubenswrapper[5200]: I1126 14:01:20.092259 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8"} err="failed to get container status \"09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8\": rpc error: code = NotFound desc = could not find container \"09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8\": container with ID starting with 09d24f5ebdbd55e5d3dab4a34f12a189f3a1911f14344a765cc77caedf174aa8 not found: ID does not exist" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.092282 5200 scope.go:117] "RemoveContainer" containerID="3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8" Nov 26 14:01:20 crc kubenswrapper[5200]: E1126 14:01:20.092600 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8\": container with ID starting with 3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8 not found: ID does not exist" containerID="3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.092633 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8"} err="failed to get container status \"3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8\": rpc error: code = NotFound desc = could not find container \"3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8\": container with ID starting with 3873c2780d238b15d62d3bbd0b3b743ecd30da41fdc4e683e8721c5669bc72d8 not found: ID does not exist" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.986902 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e06b57b-579c-4287-8269-3c9832b72835" path="/var/lib/kubelet/pods/9e06b57b-579c-4287-8269-3c9832b72835/volumes" Nov 26 14:01:20 crc kubenswrapper[5200]: I1126 14:01:20.989758 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" path="/var/lib/kubelet/pods/ff06912b-7c32-43fe-bd03-9d18798342b9/volumes" Nov 26 14:01:25 crc kubenswrapper[5200]: I1126 14:01:25.236000 5200 scope.go:117] "RemoveContainer" containerID="db44a245ce96292b488afb54ed4cd43056384ab28322643e447b6c397b81175f" Nov 26 14:01:25 crc kubenswrapper[5200]: I1126 14:01:25.285027 5200 scope.go:117] "RemoveContainer" containerID="30a3209f439deb9a511a8367f54f1b470b4ab50a043f4340c8e0dff3bed96a28" Nov 26 14:01:25 crc kubenswrapper[5200]: I1126 14:01:25.333548 5200 scope.go:117] "RemoveContainer" containerID="04f3c7f81e26aea0c5d43348c1a7d828024b5a90fc0938d3e7f0adb21f560dc2" Nov 26 14:01:25 crc kubenswrapper[5200]: I1126 14:01:25.364066 5200 scope.go:117] "RemoveContainer" containerID="958957b973036b5e2eae357d60b2788b353e1ed18b3527c2edf63e87c97959a7" Nov 26 14:01:25 crc kubenswrapper[5200]: I1126 14:01:25.403171 5200 scope.go:117] "RemoveContainer" containerID="f2897af7d677ef9e566180c09815773b0b5d2d1175b9ce424e5ca281bca9c086" Nov 26 14:01:30 crc kubenswrapper[5200]: I1126 14:01:30.969434 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:01:30 crc kubenswrapper[5200]: E1126 14:01:30.970591 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:01:42 crc kubenswrapper[5200]: E1126 14:01:42.323210 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169200 actualBytes=10240 Nov 26 14:01:45 crc kubenswrapper[5200]: I1126 14:01:45.971408 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:01:45 crc kubenswrapper[5200]: E1126 14:01:45.972508 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:02:00 crc kubenswrapper[5200]: I1126 14:02:00.972025 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:02:00 crc kubenswrapper[5200]: E1126 14:02:00.973562 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:02:11 crc kubenswrapper[5200]: I1126 14:02:11.969380 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:02:11 crc kubenswrapper[5200]: E1126 14:02:11.970275 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:02:24 crc kubenswrapper[5200]: I1126 14:02:24.969760 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:02:24 crc kubenswrapper[5200]: E1126 14:02:24.970743 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:02:39 crc kubenswrapper[5200]: I1126 14:02:39.969851 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:02:39 crc kubenswrapper[5200]: I1126 14:02:39.974798 5200 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Nov 26 14:02:40 crc kubenswrapper[5200]: I1126 14:02:40.738776 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"26e143ab2dbec69cb9afadff67068536aa38d2ad1fc43cc58a18e82250488c75"} Nov 26 14:02:42 crc kubenswrapper[5200]: E1126 14:02:42.339255 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169199 actualBytes=10240 Nov 26 14:03:42 crc kubenswrapper[5200]: E1126 14:03:42.189465 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169199 actualBytes=10240 Nov 26 14:04:41 crc kubenswrapper[5200]: E1126 14:04:41.946484 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167875 actualBytes=10240 Nov 26 14:05:03 crc kubenswrapper[5200]: I1126 14:05:03.154404 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:05:03 crc kubenswrapper[5200]: I1126 14:05:03.155472 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:05:33 crc kubenswrapper[5200]: I1126 14:05:33.154787 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:05:33 crc kubenswrapper[5200]: I1126 14:05:33.156022 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:05:35 crc kubenswrapper[5200]: I1126 14:05:35.559822 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:05:35 crc kubenswrapper[5200]: I1126 14:05:35.569507 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:05:35 crc kubenswrapper[5200]: I1126 14:05:35.571084 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:05:35 crc kubenswrapper[5200]: I1126 14:05:35.577415 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:05:42 crc kubenswrapper[5200]: E1126 14:05:42.186185 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167873 
actualBytes=10240 Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.154484 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.155294 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.155360 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.156097 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26e143ab2dbec69cb9afadff67068536aa38d2ad1fc43cc58a18e82250488c75"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.156173 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://26e143ab2dbec69cb9afadff67068536aa38d2ad1fc43cc58a18e82250488c75" gracePeriod=600 Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.932540 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="26e143ab2dbec69cb9afadff67068536aa38d2ad1fc43cc58a18e82250488c75" exitCode=0 Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.932627 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"26e143ab2dbec69cb9afadff67068536aa38d2ad1fc43cc58a18e82250488c75"} Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.933195 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b"} Nov 26 14:06:03 crc kubenswrapper[5200]: I1126 14:06:03.933222 5200 scope.go:117] "RemoveContainer" containerID="5994662cd15f06ae447b7d601bcf6adba56502c8e0f564415b2f0319ee50e970" Nov 26 14:06:42 crc kubenswrapper[5200]: E1126 14:06:42.221662 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167872 actualBytes=10240 Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.500291 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cjvgk"] Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502280 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="extract-content" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502299 5200 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="extract-content" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502320 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="extract-utilities" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502330 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="extract-utilities" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502360 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="extract-utilities" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502368 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="extract-utilities" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502384 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="extract-utilities" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502390 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="extract-utilities" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502410 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502417 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502440 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="extract-content" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502447 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="extract-content" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502460 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502468 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502484 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="extract-content" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502491 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="extract-content" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502500 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502506 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502701 5200 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="ff06912b-7c32-43fe-bd03-9d18798342b9" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502723 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd54d656-f23e-46b8-b432-55850457774f" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.502737 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e06b57b-579c-4287-8269-3c9832b72835" containerName="registry-server" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.509809 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cjvgk"] Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.509980 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.620389 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-catalog-content\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.620478 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6psrf\" (UniqueName: \"kubernetes.io/projected/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-kube-api-access-6psrf\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.620589 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-utilities\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.721586 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-utilities\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.721692 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-catalog-content\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.721721 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6psrf\" (UniqueName: \"kubernetes.io/projected/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-kube-api-access-6psrf\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.722237 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-utilities\") pod 
\"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.722328 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-catalog-content\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.742715 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6psrf\" (UniqueName: \"kubernetes.io/projected/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-kube-api-access-6psrf\") pod \"community-operators-cjvgk\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:11 crc kubenswrapper[5200]: I1126 14:07:11.830525 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:12 crc kubenswrapper[5200]: I1126 14:07:12.357900 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cjvgk"] Nov 26 14:07:12 crc kubenswrapper[5200]: I1126 14:07:12.672413 5200 generic.go:358] "Generic (PLEG): container finished" podID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerID="07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145" exitCode=0 Nov 26 14:07:12 crc kubenswrapper[5200]: I1126 14:07:12.672604 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjvgk" event={"ID":"ba7ad2c6-569b-410e-9dd9-546796b1ca4c","Type":"ContainerDied","Data":"07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145"} Nov 26 14:07:12 crc kubenswrapper[5200]: I1126 14:07:12.672729 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjvgk" event={"ID":"ba7ad2c6-569b-410e-9dd9-546796b1ca4c","Type":"ContainerStarted","Data":"67ba7225da165547397615040e1d7c6719f5de840a9f989c183581b9ce599988"} Nov 26 14:07:14 crc kubenswrapper[5200]: I1126 14:07:14.697150 5200 generic.go:358] "Generic (PLEG): container finished" podID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerID="746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac" exitCode=0 Nov 26 14:07:14 crc kubenswrapper[5200]: I1126 14:07:14.697278 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjvgk" event={"ID":"ba7ad2c6-569b-410e-9dd9-546796b1ca4c","Type":"ContainerDied","Data":"746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac"} Nov 26 14:07:15 crc kubenswrapper[5200]: I1126 14:07:15.714175 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjvgk" event={"ID":"ba7ad2c6-569b-410e-9dd9-546796b1ca4c","Type":"ContainerStarted","Data":"e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3"} Nov 26 14:07:15 crc kubenswrapper[5200]: I1126 14:07:15.764730 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cjvgk" podStartSLOduration=3.455436651 podStartE2EDuration="4.76470351s" podCreationTimestamp="2025-11-26 14:07:11 +0000 UTC" firstStartedPulling="2025-11-26 14:07:12.674008067 +0000 UTC m=+2201.353209895" lastFinishedPulling="2025-11-26 14:07:13.983274946 +0000 UTC 
m=+2202.662476754" observedRunningTime="2025-11-26 14:07:15.740555999 +0000 UTC m=+2204.419757827" watchObservedRunningTime="2025-11-26 14:07:15.76470351 +0000 UTC m=+2204.443905318" Nov 26 14:07:21 crc kubenswrapper[5200]: I1126 14:07:21.831420 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:21 crc kubenswrapper[5200]: I1126 14:07:21.831928 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:21 crc kubenswrapper[5200]: I1126 14:07:21.911574 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:22 crc kubenswrapper[5200]: I1126 14:07:22.870768 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:22 crc kubenswrapper[5200]: I1126 14:07:22.943570 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cjvgk"] Nov 26 14:07:24 crc kubenswrapper[5200]: I1126 14:07:24.842552 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cjvgk" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="registry-server" containerID="cri-o://e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3" gracePeriod=2 Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.308536 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.392754 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-utilities\") pod \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.392837 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6psrf\" (UniqueName: \"kubernetes.io/projected/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-kube-api-access-6psrf\") pod \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.392967 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-catalog-content\") pod \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\" (UID: \"ba7ad2c6-569b-410e-9dd9-546796b1ca4c\") " Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.395784 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-utilities" (OuterVolumeSpecName: "utilities") pod "ba7ad2c6-569b-410e-9dd9-546796b1ca4c" (UID: "ba7ad2c6-569b-410e-9dd9-546796b1ca4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.407532 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-kube-api-access-6psrf" (OuterVolumeSpecName: "kube-api-access-6psrf") pod "ba7ad2c6-569b-410e-9dd9-546796b1ca4c" (UID: "ba7ad2c6-569b-410e-9dd9-546796b1ca4c"). 
InnerVolumeSpecName "kube-api-access-6psrf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.467729 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba7ad2c6-569b-410e-9dd9-546796b1ca4c" (UID: "ba7ad2c6-569b-410e-9dd9-546796b1ca4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.495756 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.495807 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6psrf\" (UniqueName: \"kubernetes.io/projected/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-kube-api-access-6psrf\") on node \"crc\" DevicePath \"\"" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.495826 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7ad2c6-569b-410e-9dd9-546796b1ca4c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.685927 5200 scope.go:117] "RemoveContainer" containerID="04468d4e2c0ba05ab979afbac14a679427ea077ce7af8d2cdaca2cfaff315d2d" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.715805 5200 scope.go:117] "RemoveContainer" containerID="8154869cf21a113e1d539a7c8e6a3cfb279bc214e4211a731f0bc7af42ca5aa3" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.755128 5200 scope.go:117] "RemoveContainer" containerID="67dd197a40bbb5f78125b20116d34f70e1ffd9b7ce94b054bc6ec455a2625356" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.852164 5200 generic.go:358] "Generic (PLEG): container finished" podID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerID="e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3" exitCode=0 Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.852278 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjvgk" event={"ID":"ba7ad2c6-569b-410e-9dd9-546796b1ca4c","Type":"ContainerDied","Data":"e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3"} Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.852732 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjvgk" event={"ID":"ba7ad2c6-569b-410e-9dd9-546796b1ca4c","Type":"ContainerDied","Data":"67ba7225da165547397615040e1d7c6719f5de840a9f989c183581b9ce599988"} Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.852767 5200 scope.go:117] "RemoveContainer" containerID="e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.852300 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cjvgk" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.878364 5200 scope.go:117] "RemoveContainer" containerID="746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.911312 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cjvgk"] Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.918854 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cjvgk"] Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.924752 5200 scope.go:117] "RemoveContainer" containerID="07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.949545 5200 scope.go:117] "RemoveContainer" containerID="e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3" Nov 26 14:07:25 crc kubenswrapper[5200]: E1126 14:07:25.950070 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3\": container with ID starting with e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3 not found: ID does not exist" containerID="e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.950119 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3"} err="failed to get container status \"e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3\": rpc error: code = NotFound desc = could not find container \"e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3\": container with ID starting with e8ba29948575d602fe6f3726bd16440c365460fa69f9dfbce0a045b4653743b3 not found: ID does not exist" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.950155 5200 scope.go:117] "RemoveContainer" containerID="746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac" Nov 26 14:07:25 crc kubenswrapper[5200]: E1126 14:07:25.950506 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac\": container with ID starting with 746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac not found: ID does not exist" containerID="746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.950649 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac"} err="failed to get container status \"746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac\": rpc error: code = NotFound desc = could not find container \"746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac\": container with ID starting with 746f3f96950c61678ee08adcd0c3a0abd9ad686643531dfb50dd58e19c1b0dac not found: ID does not exist" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.950801 5200 scope.go:117] "RemoveContainer" containerID="07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145" Nov 26 14:07:25 crc kubenswrapper[5200]: E1126 14:07:25.956073 5200 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145\": container with ID starting with 07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145 not found: ID does not exist" containerID="07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145" Nov 26 14:07:25 crc kubenswrapper[5200]: I1126 14:07:25.956109 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145"} err="failed to get container status \"07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145\": rpc error: code = NotFound desc = could not find container \"07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145\": container with ID starting with 07f63f644b5b3f3a375cc1174a691232092c0b7e467b1b6d7dfd8c2600508145 not found: ID does not exist" Nov 26 14:07:26 crc kubenswrapper[5200]: I1126 14:07:26.983348 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" path="/var/lib/kubelet/pods/ba7ad2c6-569b-410e-9dd9-546796b1ca4c/volumes" Nov 26 14:07:42 crc kubenswrapper[5200]: E1126 14:07:42.436984 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167878 actualBytes=10240 Nov 26 14:08:03 crc kubenswrapper[5200]: I1126 14:08:03.155073 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:08:03 crc kubenswrapper[5200]: I1126 14:08:03.155902 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:08:33 crc kubenswrapper[5200]: I1126 14:08:33.154779 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:08:33 crc kubenswrapper[5200]: I1126 14:08:33.155777 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:08:42 crc kubenswrapper[5200]: E1126 14:08:42.746738 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167883 actualBytes=10240 Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.154560 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.155895 5200 prober.go:120] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.156013 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.157450 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.157611 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" gracePeriod=600 Nov 26 14:09:03 crc kubenswrapper[5200]: E1126 14:09:03.299961 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.949481 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" exitCode=0 Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.949570 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b"} Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.949647 5200 scope.go:117] "RemoveContainer" containerID="26e143ab2dbec69cb9afadff67068536aa38d2ad1fc43cc58a18e82250488c75" Nov 26 14:09:03 crc kubenswrapper[5200]: I1126 14:09:03.950496 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:09:03 crc kubenswrapper[5200]: E1126 14:09:03.951208 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:09:14 crc kubenswrapper[5200]: I1126 14:09:14.969838 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:09:14 crc kubenswrapper[5200]: E1126 14:09:14.971456 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:09:27 crc kubenswrapper[5200]: I1126 14:09:27.970196 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:09:27 crc kubenswrapper[5200]: E1126 14:09:27.971178 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:09:38 crc kubenswrapper[5200]: I1126 14:09:38.970226 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:09:38 crc kubenswrapper[5200]: E1126 14:09:38.971459 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:09:42 crc kubenswrapper[5200]: E1126 14:09:42.989386 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169202 actualBytes=10240 Nov 26 14:09:50 crc kubenswrapper[5200]: I1126 14:09:50.970323 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:09:50 crc kubenswrapper[5200]: E1126 14:09:50.972467 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:10:01 crc kubenswrapper[5200]: I1126 14:10:01.969769 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:10:01 crc kubenswrapper[5200]: E1126 14:10:01.971355 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:10:13 crc kubenswrapper[5200]: I1126 14:10:13.970036 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:10:13 crc kubenswrapper[5200]: E1126 14:10:13.971062 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:10:28 crc kubenswrapper[5200]: I1126 14:10:28.970571 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:10:28 crc kubenswrapper[5200]: E1126 14:10:28.972571 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:10:35 crc kubenswrapper[5200]: I1126 14:10:35.691158 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:10:35 crc kubenswrapper[5200]: I1126 14:10:35.699939 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:10:35 crc kubenswrapper[5200]: I1126 14:10:35.707230 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:10:35 crc kubenswrapper[5200]: I1126 14:10:35.712180 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:10:40 crc kubenswrapper[5200]: I1126 14:10:40.970941 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:10:40 crc kubenswrapper[5200]: E1126 14:10:40.973635 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:10:42 crc kubenswrapper[5200]: E1126 14:10:42.272353 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169206 actualBytes=10240 Nov 26 14:10:54 crc kubenswrapper[5200]: I1126 14:10:54.970551 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:10:54 crc kubenswrapper[5200]: E1126 14:10:54.972003 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 
14:11:06.439932 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4q74v"] Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442288 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="extract-utilities" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442310 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="extract-utilities" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442351 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="extract-content" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442360 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="extract-content" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442377 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="registry-server" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442384 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="registry-server" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.442558 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba7ad2c6-569b-410e-9dd9-546796b1ca4c" containerName="registry-server" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.452535 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.467558 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4q74v"] Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.556558 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj87h\" (UniqueName: \"kubernetes.io/projected/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-kube-api-access-kj87h\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.557117 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-catalog-content\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.557864 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-utilities\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.660173 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-utilities\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 
26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.660640 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj87h\" (UniqueName: \"kubernetes.io/projected/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-kube-api-access-kj87h\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.660837 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-catalog-content\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.661489 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-utilities\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.661519 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-catalog-content\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.692526 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj87h\" (UniqueName: \"kubernetes.io/projected/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-kube-api-access-kj87h\") pod \"redhat-marketplace-4q74v\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.788457 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:06 crc kubenswrapper[5200]: I1126 14:11:06.996157 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:11:06 crc kubenswrapper[5200]: E1126 14:11:06.996509 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.332398 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4q74v"] Nov 26 14:11:07 crc kubenswrapper[5200]: W1126 14:11:07.355006 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475d84ae_cd46_4a7c_a4cf_a0749f8e10f0.slice/crio-ce71adfc0690809920ff85b3d47f3a0374b1b2d81b0aa08108e428537c7ca030 WatchSource:0}: Error finding container ce71adfc0690809920ff85b3d47f3a0374b1b2d81b0aa08108e428537c7ca030: Status 404 returned error can't find the container with id ce71adfc0690809920ff85b3d47f3a0374b1b2d81b0aa08108e428537c7ca030 Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.360130 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.399951 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerStarted","Data":"ce71adfc0690809920ff85b3d47f3a0374b1b2d81b0aa08108e428537c7ca030"} Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.441206 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l5jrv"] Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.451658 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.458972 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5jrv"] Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.579382 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-catalog-content\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.579544 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8cfz\" (UniqueName: \"kubernetes.io/projected/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-kube-api-access-x8cfz\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.579929 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-utilities\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.682336 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-utilities\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.682427 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-catalog-content\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.682472 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8cfz\" (UniqueName: \"kubernetes.io/projected/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-kube-api-access-x8cfz\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.683101 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-utilities\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.683192 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-catalog-content\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.715945 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x8cfz\" (UniqueName: \"kubernetes.io/projected/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-kube-api-access-x8cfz\") pod \"redhat-operators-l5jrv\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:07 crc kubenswrapper[5200]: I1126 14:11:07.796100 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.066042 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5jrv"] Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.411944 5200 generic.go:358] "Generic (PLEG): container finished" podID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerID="cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7" exitCode=0 Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.412101 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5jrv" event={"ID":"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf","Type":"ContainerDied","Data":"cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7"} Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.412172 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5jrv" event={"ID":"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf","Type":"ContainerStarted","Data":"56c2c29a3339c737a3eca029003afe3eaf338633d37c6a48f2d25db4269e71d6"} Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.415098 5200 generic.go:358] "Generic (PLEG): container finished" podID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerID="ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37" exitCode=0 Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.415150 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerDied","Data":"ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37"} Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.823716 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dc2mn"] Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.830673 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:08 crc kubenswrapper[5200]: I1126 14:11:08.887000 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc2mn"] Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.006389 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc6kc\" (UniqueName: \"kubernetes.io/projected/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-kube-api-access-mc6kc\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.006512 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-utilities\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.006563 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-catalog-content\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.114343 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc6kc\" (UniqueName: \"kubernetes.io/projected/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-kube-api-access-mc6kc\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.114857 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-utilities\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.115036 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-catalog-content\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.115949 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-utilities\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.117080 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-catalog-content\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.150705 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mc6kc\" (UniqueName: \"kubernetes.io/projected/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-kube-api-access-mc6kc\") pod \"certified-operators-dc2mn\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.187028 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.427943 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerStarted","Data":"6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f"} Nov 26 14:11:09 crc kubenswrapper[5200]: I1126 14:11:09.708541 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc2mn"] Nov 26 14:11:09 crc kubenswrapper[5200]: W1126 14:11:09.717467 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda452a36b_5afe_4f53_9d82_d5776a4b4aa8.slice/crio-8f32b9e073e2f238fd4d6ca38f6a9dcda31e8f1a0ac665e6fb9a45a0b24673da WatchSource:0}: Error finding container 8f32b9e073e2f238fd4d6ca38f6a9dcda31e8f1a0ac665e6fb9a45a0b24673da: Status 404 returned error can't find the container with id 8f32b9e073e2f238fd4d6ca38f6a9dcda31e8f1a0ac665e6fb9a45a0b24673da Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.443150 5200 generic.go:358] "Generic (PLEG): container finished" podID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerID="6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f" exitCode=0 Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.443372 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerDied","Data":"6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f"} Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.447560 5200 generic.go:358] "Generic (PLEG): container finished" podID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerID="84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe" exitCode=0 Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.447725 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc2mn" event={"ID":"a452a36b-5afe-4f53-9d82-d5776a4b4aa8","Type":"ContainerDied","Data":"84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe"} Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.447794 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc2mn" event={"ID":"a452a36b-5afe-4f53-9d82-d5776a4b4aa8","Type":"ContainerStarted","Data":"8f32b9e073e2f238fd4d6ca38f6a9dcda31e8f1a0ac665e6fb9a45a0b24673da"} Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.452662 5200 generic.go:358] "Generic (PLEG): container finished" podID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerID="cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3" exitCode=0 Nov 26 14:11:10 crc kubenswrapper[5200]: I1126 14:11:10.452932 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5jrv" 
event={"ID":"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf","Type":"ContainerDied","Data":"cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3"} Nov 26 14:11:11 crc kubenswrapper[5200]: I1126 14:11:11.470400 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5jrv" event={"ID":"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf","Type":"ContainerStarted","Data":"c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963"} Nov 26 14:11:11 crc kubenswrapper[5200]: I1126 14:11:11.475005 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerStarted","Data":"c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d"} Nov 26 14:11:11 crc kubenswrapper[5200]: I1126 14:11:11.503679 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l5jrv" podStartSLOduration=3.4626415870000002 podStartE2EDuration="4.50365195s" podCreationTimestamp="2025-11-26 14:11:07 +0000 UTC" firstStartedPulling="2025-11-26 14:11:08.413800615 +0000 UTC m=+2437.093002473" lastFinishedPulling="2025-11-26 14:11:09.454811008 +0000 UTC m=+2438.134012836" observedRunningTime="2025-11-26 14:11:11.497811002 +0000 UTC m=+2440.177012880" watchObservedRunningTime="2025-11-26 14:11:11.50365195 +0000 UTC m=+2440.182853778" Nov 26 14:11:11 crc kubenswrapper[5200]: I1126 14:11:11.531938 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4q74v" podStartSLOduration=4.795532243 podStartE2EDuration="5.531902423s" podCreationTimestamp="2025-11-26 14:11:06 +0000 UTC" firstStartedPulling="2025-11-26 14:11:08.416153858 +0000 UTC m=+2437.095355716" lastFinishedPulling="2025-11-26 14:11:09.152524078 +0000 UTC m=+2437.831725896" observedRunningTime="2025-11-26 14:11:11.52736739 +0000 UTC m=+2440.206569278" watchObservedRunningTime="2025-11-26 14:11:11.531902423 +0000 UTC m=+2440.211104281" Nov 26 14:11:12 crc kubenswrapper[5200]: I1126 14:11:12.493016 5200 generic.go:358] "Generic (PLEG): container finished" podID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerID="6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882" exitCode=0 Nov 26 14:11:12 crc kubenswrapper[5200]: I1126 14:11:12.493149 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc2mn" event={"ID":"a452a36b-5afe-4f53-9d82-d5776a4b4aa8","Type":"ContainerDied","Data":"6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882"} Nov 26 14:11:13 crc kubenswrapper[5200]: I1126 14:11:13.511024 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc2mn" event={"ID":"a452a36b-5afe-4f53-9d82-d5776a4b4aa8","Type":"ContainerStarted","Data":"09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e"} Nov 26 14:11:13 crc kubenswrapper[5200]: I1126 14:11:13.552062 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dc2mn" podStartSLOduration=4.700433698 podStartE2EDuration="5.552034438s" podCreationTimestamp="2025-11-26 14:11:08 +0000 UTC" firstStartedPulling="2025-11-26 14:11:10.449334247 +0000 UTC m=+2439.128536105" lastFinishedPulling="2025-11-26 14:11:11.300935027 +0000 UTC m=+2439.980136845" observedRunningTime="2025-11-26 14:11:13.545239695 +0000 UTC m=+2442.224441523" watchObservedRunningTime="2025-11-26 
14:11:13.552034438 +0000 UTC m=+2442.231236266" Nov 26 14:11:16 crc kubenswrapper[5200]: I1126 14:11:16.789547 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:16 crc kubenswrapper[5200]: I1126 14:11:16.790198 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:16 crc kubenswrapper[5200]: I1126 14:11:16.846110 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:17 crc kubenswrapper[5200]: I1126 14:11:17.619012 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:17 crc kubenswrapper[5200]: I1126 14:11:17.797116 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:17 crc kubenswrapper[5200]: I1126 14:11:17.797198 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:17 crc kubenswrapper[5200]: I1126 14:11:17.841163 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:18 crc kubenswrapper[5200]: I1126 14:11:18.653827 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:18 crc kubenswrapper[5200]: I1126 14:11:18.828355 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4q74v"] Nov 26 14:11:19 crc kubenswrapper[5200]: I1126 14:11:19.188121 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:19 crc kubenswrapper[5200]: I1126 14:11:19.188902 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:19 crc kubenswrapper[5200]: I1126 14:11:19.238882 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:19 crc kubenswrapper[5200]: I1126 14:11:19.595180 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4q74v" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="registry-server" containerID="cri-o://c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d" gracePeriod=2 Nov 26 14:11:19 crc kubenswrapper[5200]: I1126 14:11:19.660766 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.081753 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.226647 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-utilities\") pod \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.226935 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj87h\" (UniqueName: \"kubernetes.io/projected/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-kube-api-access-kj87h\") pod \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.227143 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-catalog-content\") pod \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\" (UID: \"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0\") " Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.231297 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-utilities" (OuterVolumeSpecName: "utilities") pod "475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" (UID: "475d84ae-cd46-4a7c-a4cf-a0749f8e10f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.238494 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5jrv"] Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.244953 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" (UID: "475d84ae-cd46-4a7c-a4cf-a0749f8e10f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.250020 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-kube-api-access-kj87h" (OuterVolumeSpecName: "kube-api-access-kj87h") pod "475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" (UID: "475d84ae-cd46-4a7c-a4cf-a0749f8e10f0"). InnerVolumeSpecName "kube-api-access-kj87h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.330223 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.330283 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.330336 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kj87h\" (UniqueName: \"kubernetes.io/projected/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0-kube-api-access-kj87h\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.612629 5200 generic.go:358] "Generic (PLEG): container finished" podID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerID="c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d" exitCode=0 Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.612736 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerDied","Data":"c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d"} Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.612793 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4q74v" event={"ID":"475d84ae-cd46-4a7c-a4cf-a0749f8e10f0","Type":"ContainerDied","Data":"ce71adfc0690809920ff85b3d47f3a0374b1b2d81b0aa08108e428537c7ca030"} Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.612815 5200 scope.go:117] "RemoveContainer" containerID="c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.612833 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4q74v" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.613449 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l5jrv" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="registry-server" containerID="cri-o://c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963" gracePeriod=2 Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.644830 5200 scope.go:117] "RemoveContainer" containerID="6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.682534 5200 scope.go:117] "RemoveContainer" containerID="ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.688871 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4q74v"] Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.693045 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4q74v"] Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.818517 5200 scope.go:117] "RemoveContainer" containerID="c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d" Nov 26 14:11:20 crc kubenswrapper[5200]: E1126 14:11:20.819139 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d\": container with ID starting with c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d not found: ID does not exist" containerID="c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.819195 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d"} err="failed to get container status \"c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d\": rpc error: code = NotFound desc = could not find container \"c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d\": container with ID starting with c1ea850c98c0741b6f9f59e6d2d189396e8ffed9779ff5abfe4d95a40484f48d not found: ID does not exist" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.819230 5200 scope.go:117] "RemoveContainer" containerID="6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f" Nov 26 14:11:20 crc kubenswrapper[5200]: E1126 14:11:20.819645 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f\": container with ID starting with 6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f not found: ID does not exist" containerID="6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.819721 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f"} err="failed to get container status \"6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f\": rpc error: code = NotFound desc = could not find container \"6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f\": container with ID starting with 
6e8c5e6c776a0cecbdd44f020c5a1f729ea4ccb6268a05453d5d76806824730f not found: ID does not exist" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.819744 5200 scope.go:117] "RemoveContainer" containerID="ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37" Nov 26 14:11:20 crc kubenswrapper[5200]: E1126 14:11:20.820073 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37\": container with ID starting with ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37 not found: ID does not exist" containerID="ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.820115 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37"} err="failed to get container status \"ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37\": rpc error: code = NotFound desc = could not find container \"ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37\": container with ID starting with ef1746a0fc522cbc40db2a101fb63e18f09ce6f6a792b249a533691eefe19a37 not found: ID does not exist" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.971964 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:11:20 crc kubenswrapper[5200]: E1126 14:11:20.972494 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:11:20 crc kubenswrapper[5200]: I1126 14:11:20.983515 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" path="/var/lib/kubelet/pods/475d84ae-cd46-4a7c-a4cf-a0749f8e10f0/volumes" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.045922 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.146226 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8cfz\" (UniqueName: \"kubernetes.io/projected/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-kube-api-access-x8cfz\") pod \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.146307 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-catalog-content\") pod \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.146724 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-utilities\") pod \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\" (UID: \"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf\") " Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.148588 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-utilities" (OuterVolumeSpecName: "utilities") pod "8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" (UID: "8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.153667 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-kube-api-access-x8cfz" (OuterVolumeSpecName: "kube-api-access-x8cfz") pod "8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" (UID: "8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf"). InnerVolumeSpecName "kube-api-access-x8cfz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.249131 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.249186 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8cfz\" (UniqueName: \"kubernetes.io/projected/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-kube-api-access-x8cfz\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.263677 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" (UID: "8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.350665 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.629532 5200 generic.go:358] "Generic (PLEG): container finished" podID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerID="c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963" exitCode=0 Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.629757 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5jrv" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.629814 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5jrv" event={"ID":"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf","Type":"ContainerDied","Data":"c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963"} Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.629960 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5jrv" event={"ID":"8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf","Type":"ContainerDied","Data":"56c2c29a3339c737a3eca029003afe3eaf338633d37c6a48f2d25db4269e71d6"} Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.630001 5200 scope.go:117] "RemoveContainer" containerID="c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.682444 5200 scope.go:117] "RemoveContainer" containerID="cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.691816 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5jrv"] Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.709538 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l5jrv"] Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.717570 5200 scope.go:117] "RemoveContainer" containerID="cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.746663 5200 scope.go:117] "RemoveContainer" containerID="c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963" Nov 26 14:11:21 crc kubenswrapper[5200]: E1126 14:11:21.747348 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963\": container with ID starting with c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963 not found: ID does not exist" containerID="c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.747411 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963"} err="failed to get container status \"c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963\": rpc error: code = NotFound desc = could not find container \"c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963\": container with ID starting with c71acd57215bce064aaafbbedbeeae343c863640a2f43b4c559bb2377796f963 not found: ID does not exist" Nov 26 14:11:21 crc 
kubenswrapper[5200]: I1126 14:11:21.747452 5200 scope.go:117] "RemoveContainer" containerID="cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3" Nov 26 14:11:21 crc kubenswrapper[5200]: E1126 14:11:21.748017 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3\": container with ID starting with cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3 not found: ID does not exist" containerID="cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.748065 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3"} err="failed to get container status \"cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3\": rpc error: code = NotFound desc = could not find container \"cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3\": container with ID starting with cf59313659c7a70d943331b77a1af90be60b73f8676ac79fbfa92eae3523dff3 not found: ID does not exist" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.748091 5200 scope.go:117] "RemoveContainer" containerID="cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7" Nov 26 14:11:21 crc kubenswrapper[5200]: E1126 14:11:21.748478 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7\": container with ID starting with cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7 not found: ID does not exist" containerID="cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7" Nov 26 14:11:21 crc kubenswrapper[5200]: I1126 14:11:21.748507 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7"} err="failed to get container status \"cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7\": rpc error: code = NotFound desc = could not find container \"cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7\": container with ID starting with cae5a8bdf8fe25c5e42ef755498a99f7d91490234a982389ea130cb1869dd8c7 not found: ID does not exist" Nov 26 14:11:22 crc kubenswrapper[5200]: I1126 14:11:22.614460 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc2mn"] Nov 26 14:11:22 crc kubenswrapper[5200]: I1126 14:11:22.648549 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dc2mn" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="registry-server" containerID="cri-o://09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e" gracePeriod=2 Nov 26 14:11:22 crc kubenswrapper[5200]: I1126 14:11:22.990307 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" path="/var/lib/kubelet/pods/8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf/volumes" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.140573 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.288952 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-utilities\") pod \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.289625 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc6kc\" (UniqueName: \"kubernetes.io/projected/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-kube-api-access-mc6kc\") pod \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.289932 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-catalog-content\") pod \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\" (UID: \"a452a36b-5afe-4f53-9d82-d5776a4b4aa8\") " Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.290581 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-utilities" (OuterVolumeSpecName: "utilities") pod "a452a36b-5afe-4f53-9d82-d5776a4b4aa8" (UID: "a452a36b-5afe-4f53-9d82-d5776a4b4aa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.298367 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-kube-api-access-mc6kc" (OuterVolumeSpecName: "kube-api-access-mc6kc") pod "a452a36b-5afe-4f53-9d82-d5776a4b4aa8" (UID: "a452a36b-5afe-4f53-9d82-d5776a4b4aa8"). InnerVolumeSpecName "kube-api-access-mc6kc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.325837 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a452a36b-5afe-4f53-9d82-d5776a4b4aa8" (UID: "a452a36b-5afe-4f53-9d82-d5776a4b4aa8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.392834 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.392874 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mc6kc\" (UniqueName: \"kubernetes.io/projected/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-kube-api-access-mc6kc\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.392888 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a452a36b-5afe-4f53-9d82-d5776a4b4aa8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.667479 5200 generic.go:358] "Generic (PLEG): container finished" podID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerID="09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e" exitCode=0 Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.667678 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc2mn" event={"ID":"a452a36b-5afe-4f53-9d82-d5776a4b4aa8","Type":"ContainerDied","Data":"09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e"} Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.667768 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc2mn" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.667846 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc2mn" event={"ID":"a452a36b-5afe-4f53-9d82-d5776a4b4aa8","Type":"ContainerDied","Data":"8f32b9e073e2f238fd4d6ca38f6a9dcda31e8f1a0ac665e6fb9a45a0b24673da"} Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.667908 5200 scope.go:117] "RemoveContainer" containerID="09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.723181 5200 scope.go:117] "RemoveContainer" containerID="6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.745161 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc2mn"] Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.757662 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dc2mn"] Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.770123 5200 scope.go:117] "RemoveContainer" containerID="84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.804164 5200 scope.go:117] "RemoveContainer" containerID="09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e" Nov 26 14:11:23 crc kubenswrapper[5200]: E1126 14:11:23.805409 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e\": container with ID starting with 09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e not found: ID does not exist" containerID="09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.805522 
5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e"} err="failed to get container status \"09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e\": rpc error: code = NotFound desc = could not find container \"09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e\": container with ID starting with 09e3adfe928c8ca716f4e04099739c47f11e6c95006befea79d734fad341f14e not found: ID does not exist" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.805593 5200 scope.go:117] "RemoveContainer" containerID="6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882" Nov 26 14:11:23 crc kubenswrapper[5200]: E1126 14:11:23.806527 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882\": container with ID starting with 6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882 not found: ID does not exist" containerID="6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.806590 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882"} err="failed to get container status \"6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882\": rpc error: code = NotFound desc = could not find container \"6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882\": container with ID starting with 6a08cfd664682d3820e6e8d6d695df70773559d66cf02708a71c938802bab882 not found: ID does not exist" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.806642 5200 scope.go:117] "RemoveContainer" containerID="84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe" Nov 26 14:11:23 crc kubenswrapper[5200]: E1126 14:11:23.810861 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe\": container with ID starting with 84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe not found: ID does not exist" containerID="84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe" Nov 26 14:11:23 crc kubenswrapper[5200]: I1126 14:11:23.810921 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe"} err="failed to get container status \"84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe\": rpc error: code = NotFound desc = could not find container \"84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe\": container with ID starting with 84511368e1b02ffcc980e0635e5bc2d97ab892c8f86e57e6e50a5cb3ab219ffe not found: ID does not exist" Nov 26 14:11:24 crc kubenswrapper[5200]: I1126 14:11:24.987260 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" path="/var/lib/kubelet/pods/a452a36b-5afe-4f53-9d82-d5776a4b4aa8/volumes" Nov 26 14:11:35 crc kubenswrapper[5200]: I1126 14:11:35.969590 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:11:35 crc kubenswrapper[5200]: E1126 14:11:35.971189 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:11:42 crc kubenswrapper[5200]: E1126 14:11:42.159273 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169207 actualBytes=10240 Nov 26 14:11:48 crc kubenswrapper[5200]: I1126 14:11:48.983588 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:11:48 crc kubenswrapper[5200]: E1126 14:11:48.984480 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:11:59 crc kubenswrapper[5200]: I1126 14:11:59.970312 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:11:59 crc kubenswrapper[5200]: E1126 14:11:59.972006 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:12:12 crc kubenswrapper[5200]: I1126 14:12:12.980651 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:12:12 crc kubenswrapper[5200]: E1126 14:12:12.982628 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:12:25 crc kubenswrapper[5200]: I1126 14:12:25.969597 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:12:25 crc kubenswrapper[5200]: E1126 14:12:25.970549 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:12:39 crc kubenswrapper[5200]: I1126 14:12:39.969736 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:12:39 crc kubenswrapper[5200]: E1126 14:12:39.970957 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:12:42 crc kubenswrapper[5200]: E1126 14:12:42.102788 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167884 actualBytes=10240 Nov 26 14:12:50 crc kubenswrapper[5200]: I1126 14:12:50.970774 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:12:50 crc kubenswrapper[5200]: E1126 14:12:50.972133 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:13:01 crc kubenswrapper[5200]: I1126 14:13:01.970307 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:13:01 crc kubenswrapper[5200]: E1126 14:13:01.972013 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:13:15 crc kubenswrapper[5200]: I1126 14:13:15.151727 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:13:15 crc kubenswrapper[5200]: E1126 14:13:15.152812 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:13:27 crc kubenswrapper[5200]: I1126 14:13:27.969442 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:13:27 crc kubenswrapper[5200]: E1126 14:13:27.971954 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:13:39 crc kubenswrapper[5200]: I1126 14:13:39.969989 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:13:39 crc kubenswrapper[5200]: E1126 14:13:39.971313 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:13:42 crc kubenswrapper[5200]: E1126 14:13:42.194262 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167881 actualBytes=10240 Nov 26 14:13:50 crc kubenswrapper[5200]: I1126 14:13:50.969367 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:13:50 crc kubenswrapper[5200]: E1126 14:13:50.970704 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:14:01 crc kubenswrapper[5200]: I1126 14:14:01.970328 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:14:01 crc kubenswrapper[5200]: E1126 14:14:01.971729 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:14:13 crc kubenswrapper[5200]: I1126 14:14:13.970318 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:14:14 crc kubenswrapper[5200]: I1126 14:14:14.814527 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"3136610fc20ce137d83b927c16e32ed47a429458ed33fa245c226c7db0fa9e66"} Nov 26 14:14:42 crc kubenswrapper[5200]: E1126 14:14:42.471999 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167877 actualBytes=10240 Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.155493 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r"] Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158311 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="extract-content" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158347 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="extract-content" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158374 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158382 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158399 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158408 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158453 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="extract-utilities" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158465 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="extract-utilities" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158490 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158498 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158516 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="extract-content" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158524 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="extract-content" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158548 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="extract-content" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158558 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="extract-content" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158572 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="extract-utilities" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158582 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="extract-utilities" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158620 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="extract-utilities" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158629 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" containerName="extract-utilities" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158861 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8774e8da-6cdb-44e5-bf70-6ba4d3dfdadf" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158883 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="475d84ae-cd46-4a7c-a4cf-a0749f8e10f0" containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.158904 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a452a36b-5afe-4f53-9d82-d5776a4b4aa8" 
containerName="registry-server" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.167406 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.169912 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.171392 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.180096 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r"] Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.227314 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lv77\" (UniqueName: \"kubernetes.io/projected/b745b09b-a67d-4aba-b374-fd012ef56f2c-kube-api-access-4lv77\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.227449 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b745b09b-a67d-4aba-b374-fd012ef56f2c-secret-volume\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.227495 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b745b09b-a67d-4aba-b374-fd012ef56f2c-config-volume\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.329064 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lv77\" (UniqueName: \"kubernetes.io/projected/b745b09b-a67d-4aba-b374-fd012ef56f2c-kube-api-access-4lv77\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.329128 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b745b09b-a67d-4aba-b374-fd012ef56f2c-secret-volume\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.329156 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b745b09b-a67d-4aba-b374-fd012ef56f2c-config-volume\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.330130 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b745b09b-a67d-4aba-b374-fd012ef56f2c-config-volume\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.337650 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b745b09b-a67d-4aba-b374-fd012ef56f2c-secret-volume\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.346512 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lv77\" (UniqueName: \"kubernetes.io/projected/b745b09b-a67d-4aba-b374-fd012ef56f2c-kube-api-access-4lv77\") pod \"collect-profiles-29402775-j5z6r\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:00 crc kubenswrapper[5200]: I1126 14:15:00.502548 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:01 crc kubenswrapper[5200]: I1126 14:15:01.022719 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r"] Nov 26 14:15:01 crc kubenswrapper[5200]: I1126 14:15:01.335447 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" event={"ID":"b745b09b-a67d-4aba-b374-fd012ef56f2c","Type":"ContainerStarted","Data":"596535f23a49d9f35e3bb8ce406a2787f8f5c41968363adc671a074720591869"} Nov 26 14:15:02 crc kubenswrapper[5200]: I1126 14:15:02.347979 5200 generic.go:358] "Generic (PLEG): container finished" podID="b745b09b-a67d-4aba-b374-fd012ef56f2c" containerID="1bad2ab490a2ef5d461b78889fc1f6972d9d09f4243c41b24b2ade67bd2f181e" exitCode=0 Nov 26 14:15:02 crc kubenswrapper[5200]: I1126 14:15:02.348071 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" event={"ID":"b745b09b-a67d-4aba-b374-fd012ef56f2c","Type":"ContainerDied","Data":"1bad2ab490a2ef5d461b78889fc1f6972d9d09f4243c41b24b2ade67bd2f181e"} Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.672944 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.791619 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lv77\" (UniqueName: \"kubernetes.io/projected/b745b09b-a67d-4aba-b374-fd012ef56f2c-kube-api-access-4lv77\") pod \"b745b09b-a67d-4aba-b374-fd012ef56f2c\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.791745 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b745b09b-a67d-4aba-b374-fd012ef56f2c-secret-volume\") pod \"b745b09b-a67d-4aba-b374-fd012ef56f2c\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.791892 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b745b09b-a67d-4aba-b374-fd012ef56f2c-config-volume\") pod \"b745b09b-a67d-4aba-b374-fd012ef56f2c\" (UID: \"b745b09b-a67d-4aba-b374-fd012ef56f2c\") " Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.792898 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b745b09b-a67d-4aba-b374-fd012ef56f2c-config-volume" (OuterVolumeSpecName: "config-volume") pod "b745b09b-a67d-4aba-b374-fd012ef56f2c" (UID: "b745b09b-a67d-4aba-b374-fd012ef56f2c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.798929 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b745b09b-a67d-4aba-b374-fd012ef56f2c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b745b09b-a67d-4aba-b374-fd012ef56f2c" (UID: "b745b09b-a67d-4aba-b374-fd012ef56f2c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.799728 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b745b09b-a67d-4aba-b374-fd012ef56f2c-kube-api-access-4lv77" (OuterVolumeSpecName: "kube-api-access-4lv77") pod "b745b09b-a67d-4aba-b374-fd012ef56f2c" (UID: "b745b09b-a67d-4aba-b374-fd012ef56f2c"). InnerVolumeSpecName "kube-api-access-4lv77". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.893836 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b745b09b-a67d-4aba-b374-fd012ef56f2c-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.893885 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lv77\" (UniqueName: \"kubernetes.io/projected/b745b09b-a67d-4aba-b374-fd012ef56f2c-kube-api-access-4lv77\") on node \"crc\" DevicePath \"\"" Nov 26 14:15:03 crc kubenswrapper[5200]: I1126 14:15:03.893898 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b745b09b-a67d-4aba-b374-fd012ef56f2c-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:15:04 crc kubenswrapper[5200]: I1126 14:15:04.367715 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" event={"ID":"b745b09b-a67d-4aba-b374-fd012ef56f2c","Type":"ContainerDied","Data":"596535f23a49d9f35e3bb8ce406a2787f8f5c41968363adc671a074720591869"} Nov 26 14:15:04 crc kubenswrapper[5200]: I1126 14:15:04.367755 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="596535f23a49d9f35e3bb8ce406a2787f8f5c41968363adc671a074720591869" Nov 26 14:15:04 crc kubenswrapper[5200]: I1126 14:15:04.367805 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r" Nov 26 14:15:04 crc kubenswrapper[5200]: I1126 14:15:04.756097 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr"] Nov 26 14:15:04 crc kubenswrapper[5200]: I1126 14:15:04.762789 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402730-z2qtr"] Nov 26 14:15:04 crc kubenswrapper[5200]: I1126 14:15:04.992541 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d5f3d3-1ccb-4a17-88ba-c76273c02a8b" path="/var/lib/kubelet/pods/55d5f3d3-1ccb-4a17-88ba-c76273c02a8b/volumes" Nov 26 14:15:26 crc kubenswrapper[5200]: I1126 14:15:26.012244 5200 scope.go:117] "RemoveContainer" containerID="59c7ab3a4317aad68d35814b5556ab11f38fcfec9ed49926ea289412ef6f7b29" Nov 26 14:15:35 crc kubenswrapper[5200]: I1126 14:15:35.811440 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:15:35 crc kubenswrapper[5200]: I1126 14:15:35.821135 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:15:35 crc kubenswrapper[5200]: I1126 14:15:35.829884 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:15:35 crc kubenswrapper[5200]: I1126 14:15:35.837122 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:15:41 crc kubenswrapper[5200]: E1126 14:15:41.822211 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" 
expectedBytes=1167876 actualBytes=10240 Nov 26 14:16:33 crc kubenswrapper[5200]: I1126 14:16:33.154293 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:16:33 crc kubenswrapper[5200]: I1126 14:16:33.155086 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:16:42 crc kubenswrapper[5200]: E1126 14:16:42.080522 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167876 actualBytes=10240 Nov 26 14:17:03 crc kubenswrapper[5200]: I1126 14:17:03.154659 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:17:03 crc kubenswrapper[5200]: I1126 14:17:03.155439 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:17:33 crc kubenswrapper[5200]: I1126 14:17:33.154537 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:17:33 crc kubenswrapper[5200]: I1126 14:17:33.155513 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:17:33 crc kubenswrapper[5200]: I1126 14:17:33.155595 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:17:33 crc kubenswrapper[5200]: I1126 14:17:33.156622 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3136610fc20ce137d83b927c16e32ed47a429458ed33fa245c226c7db0fa9e66"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:17:33 crc kubenswrapper[5200]: I1126 14:17:33.156817 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://3136610fc20ce137d83b927c16e32ed47a429458ed33fa245c226c7db0fa9e66" gracePeriod=600 Nov 26 14:17:33 crc kubenswrapper[5200]: I1126 
14:17:33.290141 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:17:34 crc kubenswrapper[5200]: I1126 14:17:34.140214 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="3136610fc20ce137d83b927c16e32ed47a429458ed33fa245c226c7db0fa9e66" exitCode=0 Nov 26 14:17:34 crc kubenswrapper[5200]: I1126 14:17:34.140253 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"3136610fc20ce137d83b927c16e32ed47a429458ed33fa245c226c7db0fa9e66"} Nov 26 14:17:34 crc kubenswrapper[5200]: I1126 14:17:34.141348 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0"} Nov 26 14:17:34 crc kubenswrapper[5200]: I1126 14:17:34.141375 5200 scope.go:117] "RemoveContainer" containerID="cf5ba9ef7e35517ae7d727032f1df45d73ccbc9fa1c2d9fe972a03eb61be334b" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.676771 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jr9tf"] Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.681357 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b745b09b-a67d-4aba-b374-fd012ef56f2c" containerName="collect-profiles" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.681421 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b745b09b-a67d-4aba-b374-fd012ef56f2c" containerName="collect-profiles" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.682268 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b745b09b-a67d-4aba-b374-fd012ef56f2c" containerName="collect-profiles" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.696542 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jr9tf"] Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.696801 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.809039 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rp7z\" (UniqueName: \"kubernetes.io/projected/44ae2c1c-8850-4b8c-886e-2aa892e70975-kube-api-access-4rp7z\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.809139 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-catalog-content\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.809350 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-utilities\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.911529 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-catalog-content\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.911612 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-utilities\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.911733 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rp7z\" (UniqueName: \"kubernetes.io/projected/44ae2c1c-8850-4b8c-886e-2aa892e70975-kube-api-access-4rp7z\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.912668 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-utilities\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.912975 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-catalog-content\") pod \"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:36 crc kubenswrapper[5200]: I1126 14:17:36.952624 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rp7z\" (UniqueName: \"kubernetes.io/projected/44ae2c1c-8850-4b8c-886e-2aa892e70975-kube-api-access-4rp7z\") pod 
\"community-operators-jr9tf\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:37 crc kubenswrapper[5200]: I1126 14:17:37.034282 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:37 crc kubenswrapper[5200]: I1126 14:17:37.559061 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jr9tf"] Nov 26 14:17:38 crc kubenswrapper[5200]: I1126 14:17:38.203580 5200 generic.go:358] "Generic (PLEG): container finished" podID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerID="a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3" exitCode=0 Nov 26 14:17:38 crc kubenswrapper[5200]: I1126 14:17:38.204082 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr9tf" event={"ID":"44ae2c1c-8850-4b8c-886e-2aa892e70975","Type":"ContainerDied","Data":"a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3"} Nov 26 14:17:38 crc kubenswrapper[5200]: I1126 14:17:38.204122 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr9tf" event={"ID":"44ae2c1c-8850-4b8c-886e-2aa892e70975","Type":"ContainerStarted","Data":"3a3b25ac921924bc96d56a6ceff9ca0f6e14daff419f8124f3a1acf1cd80a167"} Nov 26 14:17:40 crc kubenswrapper[5200]: I1126 14:17:40.230260 5200 generic.go:358] "Generic (PLEG): container finished" podID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerID="07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b" exitCode=0 Nov 26 14:17:40 crc kubenswrapper[5200]: I1126 14:17:40.230385 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr9tf" event={"ID":"44ae2c1c-8850-4b8c-886e-2aa892e70975","Type":"ContainerDied","Data":"07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b"} Nov 26 14:17:41 crc kubenswrapper[5200]: I1126 14:17:41.242796 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr9tf" event={"ID":"44ae2c1c-8850-4b8c-886e-2aa892e70975","Type":"ContainerStarted","Data":"93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908"} Nov 26 14:17:41 crc kubenswrapper[5200]: I1126 14:17:41.268082 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jr9tf" podStartSLOduration=4.005741383 podStartE2EDuration="5.268048138s" podCreationTimestamp="2025-11-26 14:17:36 +0000 UTC" firstStartedPulling="2025-11-26 14:17:38.206982937 +0000 UTC m=+2826.886184755" lastFinishedPulling="2025-11-26 14:17:39.469289662 +0000 UTC m=+2828.148491510" observedRunningTime="2025-11-26 14:17:41.265018909 +0000 UTC m=+2829.944220797" watchObservedRunningTime="2025-11-26 14:17:41.268048138 +0000 UTC m=+2829.947249956" Nov 26 14:17:42 crc kubenswrapper[5200]: E1126 14:17:42.350151 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1177247 actualBytes=10240 Nov 26 14:17:47 crc kubenswrapper[5200]: I1126 14:17:47.035590 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:47 crc kubenswrapper[5200]: I1126 14:17:47.036483 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:47 crc 
kubenswrapper[5200]: I1126 14:17:47.082492 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:47 crc kubenswrapper[5200]: I1126 14:17:47.369297 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:47 crc kubenswrapper[5200]: I1126 14:17:47.445203 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jr9tf"] Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.331456 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jr9tf" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="registry-server" containerID="cri-o://93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908" gracePeriod=2 Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.843942 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.965374 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-catalog-content\") pod \"44ae2c1c-8850-4b8c-886e-2aa892e70975\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.965554 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-utilities\") pod \"44ae2c1c-8850-4b8c-886e-2aa892e70975\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.967599 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-utilities" (OuterVolumeSpecName: "utilities") pod "44ae2c1c-8850-4b8c-886e-2aa892e70975" (UID: "44ae2c1c-8850-4b8c-886e-2aa892e70975"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.968217 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rp7z\" (UniqueName: \"kubernetes.io/projected/44ae2c1c-8850-4b8c-886e-2aa892e70975-kube-api-access-4rp7z\") pod \"44ae2c1c-8850-4b8c-886e-2aa892e70975\" (UID: \"44ae2c1c-8850-4b8c-886e-2aa892e70975\") " Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.968948 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:17:49 crc kubenswrapper[5200]: I1126 14:17:49.981401 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ae2c1c-8850-4b8c-886e-2aa892e70975-kube-api-access-4rp7z" (OuterVolumeSpecName: "kube-api-access-4rp7z") pod "44ae2c1c-8850-4b8c-886e-2aa892e70975" (UID: "44ae2c1c-8850-4b8c-886e-2aa892e70975"). InnerVolumeSpecName "kube-api-access-4rp7z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.073183 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rp7z\" (UniqueName: \"kubernetes.io/projected/44ae2c1c-8850-4b8c-886e-2aa892e70975-kube-api-access-4rp7z\") on node \"crc\" DevicePath \"\"" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.352192 5200 generic.go:358] "Generic (PLEG): container finished" podID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerID="93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908" exitCode=0 Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.352289 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jr9tf" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.352311 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr9tf" event={"ID":"44ae2c1c-8850-4b8c-886e-2aa892e70975","Type":"ContainerDied","Data":"93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908"} Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.352378 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr9tf" event={"ID":"44ae2c1c-8850-4b8c-886e-2aa892e70975","Type":"ContainerDied","Data":"3a3b25ac921924bc96d56a6ceff9ca0f6e14daff419f8124f3a1acf1cd80a167"} Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.352413 5200 scope.go:117] "RemoveContainer" containerID="93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.387475 5200 scope.go:117] "RemoveContainer" containerID="07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.423864 5200 scope.go:117] "RemoveContainer" containerID="a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.463897 5200 scope.go:117] "RemoveContainer" containerID="93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908" Nov 26 14:17:50 crc kubenswrapper[5200]: E1126 14:17:50.464800 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908\": container with ID starting with 93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908 not found: ID does not exist" containerID="93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.465149 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908"} err="failed to get container status \"93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908\": rpc error: code = NotFound desc = could not find container \"93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908\": container with ID starting with 93c63d04d7a465ab7792df55f8851d157461f0acdf5cec33c7bb912f4bdcf908 not found: ID does not exist" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.465422 5200 scope.go:117] "RemoveContainer" containerID="07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b" Nov 26 14:17:50 crc kubenswrapper[5200]: E1126 14:17:50.466305 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b\": container with ID starting with 07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b not found: ID does not exist" containerID="07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.466495 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b"} err="failed to get container status \"07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b\": rpc error: code = NotFound desc = could not find container \"07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b\": container with ID starting with 07fed5db65d37fff0371717f6bcd1cb924d34612a1a6b5e9981efbfe7f25333b not found: ID does not exist" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.466536 5200 scope.go:117] "RemoveContainer" containerID="a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3" Nov 26 14:17:50 crc kubenswrapper[5200]: E1126 14:17:50.467049 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3\": container with ID starting with a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3 not found: ID does not exist" containerID="a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.467268 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3"} err="failed to get container status \"a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3\": rpc error: code = NotFound desc = could not find container \"a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3\": container with ID starting with a422defcceb2fa5a56f87a2e97dabb580346733960b1891c1a7c1eaf5231eed3 not found: ID does not exist" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.480848 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44ae2c1c-8850-4b8c-886e-2aa892e70975" (UID: "44ae2c1c-8850-4b8c-886e-2aa892e70975"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.582079 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ae2c1c-8850-4b8c-886e-2aa892e70975-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.726282 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jr9tf"] Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.737001 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jr9tf"] Nov 26 14:17:50 crc kubenswrapper[5200]: I1126 14:17:50.982076 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" path="/var/lib/kubelet/pods/44ae2c1c-8850-4b8c-886e-2aa892e70975/volumes" Nov 26 14:18:42 crc kubenswrapper[5200]: E1126 14:18:42.429514 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169205 actualBytes=10240 Nov 26 14:19:33 crc kubenswrapper[5200]: I1126 14:19:33.155208 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:19:33 crc kubenswrapper[5200]: I1126 14:19:33.156206 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:19:42 crc kubenswrapper[5200]: E1126 14:19:42.338040 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169202 actualBytes=10240 Nov 26 14:20:03 crc kubenswrapper[5200]: I1126 14:20:03.154402 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:20:03 crc kubenswrapper[5200]: I1126 14:20:03.155401 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:20:33 crc kubenswrapper[5200]: I1126 14:20:33.154668 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:20:33 crc kubenswrapper[5200]: I1126 14:20:33.155947 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 
14:20:33 crc kubenswrapper[5200]: I1126 14:20:33.156028 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:20:33 crc kubenswrapper[5200]: I1126 14:20:33.157109 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:20:33 crc kubenswrapper[5200]: I1126 14:20:33.157223 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" gracePeriod=600 Nov 26 14:20:33 crc kubenswrapper[5200]: E1126 14:20:33.296900 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:20:34 crc kubenswrapper[5200]: I1126 14:20:34.119238 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" exitCode=0 Nov 26 14:20:34 crc kubenswrapper[5200]: I1126 14:20:34.120040 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0"} Nov 26 14:20:34 crc kubenswrapper[5200]: I1126 14:20:34.120111 5200 scope.go:117] "RemoveContainer" containerID="3136610fc20ce137d83b927c16e32ed47a429458ed33fa245c226c7db0fa9e66" Nov 26 14:20:34 crc kubenswrapper[5200]: I1126 14:20:34.120853 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:20:34 crc kubenswrapper[5200]: E1126 14:20:34.121274 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:20:35 crc kubenswrapper[5200]: I1126 14:20:35.975279 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:20:35 crc kubenswrapper[5200]: I1126 14:20:35.976123 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:20:35 crc kubenswrapper[5200]: I1126 14:20:35.981203 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:20:35 crc kubenswrapper[5200]: I1126 14:20:35.981289 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:20:42 crc kubenswrapper[5200]: E1126 14:20:42.254052 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167878 actualBytes=10240 Nov 26 14:20:44 crc kubenswrapper[5200]: I1126 14:20:44.970296 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:20:44 crc kubenswrapper[5200]: E1126 14:20:44.971592 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:20:56 crc kubenswrapper[5200]: I1126 14:20:56.970288 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:20:56 crc kubenswrapper[5200]: E1126 14:20:56.971438 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:21:10 crc kubenswrapper[5200]: I1126 14:21:10.982756 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:21:10 crc kubenswrapper[5200]: E1126 14:21:10.984980 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:21:21 crc kubenswrapper[5200]: I1126 14:21:21.969193 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:21:21 crc kubenswrapper[5200]: E1126 14:21:21.970202 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:21:34 crc kubenswrapper[5200]: I1126 14:21:34.970357 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:21:34 crc kubenswrapper[5200]: E1126 14:21:34.971713 5200 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.245167 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-swksb"] Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.246856 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="extract-content" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.246877 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="extract-content" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.246897 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="extract-utilities" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.246905 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="extract-utilities" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.246921 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="registry-server" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.246930 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="registry-server" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.247096 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="44ae2c1c-8850-4b8c-886e-2aa892e70975" containerName="registry-server" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.271392 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swksb"] Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.271591 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.363044 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-catalog-content\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.363131 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-utilities\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.363191 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz58t\" (UniqueName: \"kubernetes.io/projected/b137e0aa-5092-4986-b2e4-2b275430203a-kube-api-access-zz58t\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.464556 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-utilities\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.464670 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz58t\" (UniqueName: \"kubernetes.io/projected/b137e0aa-5092-4986-b2e4-2b275430203a-kube-api-access-zz58t\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.464762 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-catalog-content\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.465267 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-catalog-content\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.465538 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-utilities\") pod \"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.489263 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz58t\" (UniqueName: \"kubernetes.io/projected/b137e0aa-5092-4986-b2e4-2b275430203a-kube-api-access-zz58t\") pod 
\"certified-operators-swksb\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:37 crc kubenswrapper[5200]: I1126 14:21:37.590269 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:38 crc kubenswrapper[5200]: I1126 14:21:38.087416 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swksb"] Nov 26 14:21:38 crc kubenswrapper[5200]: I1126 14:21:38.744832 5200 generic.go:358] "Generic (PLEG): container finished" podID="b137e0aa-5092-4986-b2e4-2b275430203a" containerID="53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199" exitCode=0 Nov 26 14:21:38 crc kubenswrapper[5200]: I1126 14:21:38.744917 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swksb" event={"ID":"b137e0aa-5092-4986-b2e4-2b275430203a","Type":"ContainerDied","Data":"53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199"} Nov 26 14:21:38 crc kubenswrapper[5200]: I1126 14:21:38.745576 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swksb" event={"ID":"b137e0aa-5092-4986-b2e4-2b275430203a","Type":"ContainerStarted","Data":"9d0e76f2cbf6a6cb6e60145c37fb54881bb78e751d58196ece2ae1ec00d8cc19"} Nov 26 14:21:40 crc kubenswrapper[5200]: I1126 14:21:40.764431 5200 generic.go:358] "Generic (PLEG): container finished" podID="b137e0aa-5092-4986-b2e4-2b275430203a" containerID="eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372" exitCode=0 Nov 26 14:21:40 crc kubenswrapper[5200]: I1126 14:21:40.764570 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swksb" event={"ID":"b137e0aa-5092-4986-b2e4-2b275430203a","Type":"ContainerDied","Data":"eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372"} Nov 26 14:21:41 crc kubenswrapper[5200]: I1126 14:21:41.777202 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swksb" event={"ID":"b137e0aa-5092-4986-b2e4-2b275430203a","Type":"ContainerStarted","Data":"e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566"} Nov 26 14:21:41 crc kubenswrapper[5200]: I1126 14:21:41.807083 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swksb" podStartSLOduration=3.419173388 podStartE2EDuration="4.807060028s" podCreationTimestamp="2025-11-26 14:21:37 +0000 UTC" firstStartedPulling="2025-11-26 14:21:38.747112103 +0000 UTC m=+3067.426313971" lastFinishedPulling="2025-11-26 14:21:40.134998783 +0000 UTC m=+3068.814200611" observedRunningTime="2025-11-26 14:21:41.805268441 +0000 UTC m=+3070.484470319" watchObservedRunningTime="2025-11-26 14:21:41.807060028 +0000 UTC m=+3070.486261856" Nov 26 14:21:42 crc kubenswrapper[5200]: E1126 14:21:42.186385 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1175927 actualBytes=10240 Nov 26 14:21:47 crc kubenswrapper[5200]: I1126 14:21:47.591332 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:47 crc kubenswrapper[5200]: I1126 14:21:47.591867 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:47 crc 
kubenswrapper[5200]: I1126 14:21:47.639859 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:47 crc kubenswrapper[5200]: I1126 14:21:47.897040 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:47 crc kubenswrapper[5200]: I1126 14:21:47.949896 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swksb"] Nov 26 14:21:48 crc kubenswrapper[5200]: I1126 14:21:48.969970 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:21:48 crc kubenswrapper[5200]: E1126 14:21:48.970531 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:21:49 crc kubenswrapper[5200]: I1126 14:21:49.855835 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-swksb" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="registry-server" containerID="cri-o://e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566" gracePeriod=2 Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.327904 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.399035 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-utilities\") pod \"b137e0aa-5092-4986-b2e4-2b275430203a\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.399207 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-catalog-content\") pod \"b137e0aa-5092-4986-b2e4-2b275430203a\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.399244 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz58t\" (UniqueName: \"kubernetes.io/projected/b137e0aa-5092-4986-b2e4-2b275430203a-kube-api-access-zz58t\") pod \"b137e0aa-5092-4986-b2e4-2b275430203a\" (UID: \"b137e0aa-5092-4986-b2e4-2b275430203a\") " Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.401869 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-utilities" (OuterVolumeSpecName: "utilities") pod "b137e0aa-5092-4986-b2e4-2b275430203a" (UID: "b137e0aa-5092-4986-b2e4-2b275430203a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.412467 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b137e0aa-5092-4986-b2e4-2b275430203a-kube-api-access-zz58t" (OuterVolumeSpecName: "kube-api-access-zz58t") pod "b137e0aa-5092-4986-b2e4-2b275430203a" (UID: "b137e0aa-5092-4986-b2e4-2b275430203a"). InnerVolumeSpecName "kube-api-access-zz58t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.447806 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b137e0aa-5092-4986-b2e4-2b275430203a" (UID: "b137e0aa-5092-4986-b2e4-2b275430203a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.501677 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.501998 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b137e0aa-5092-4986-b2e4-2b275430203a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.502013 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zz58t\" (UniqueName: \"kubernetes.io/projected/b137e0aa-5092-4986-b2e4-2b275430203a-kube-api-access-zz58t\") on node \"crc\" DevicePath \"\"" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.870656 5200 generic.go:358] "Generic (PLEG): container finished" podID="b137e0aa-5092-4986-b2e4-2b275430203a" containerID="e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566" exitCode=0 Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.870736 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swksb" event={"ID":"b137e0aa-5092-4986-b2e4-2b275430203a","Type":"ContainerDied","Data":"e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566"} Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.870823 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swksb" event={"ID":"b137e0aa-5092-4986-b2e4-2b275430203a","Type":"ContainerDied","Data":"9d0e76f2cbf6a6cb6e60145c37fb54881bb78e751d58196ece2ae1ec00d8cc19"} Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.870863 5200 scope.go:117] "RemoveContainer" containerID="e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.870865 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-swksb" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.904206 5200 scope.go:117] "RemoveContainer" containerID="eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.918078 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swksb"] Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.928032 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swksb"] Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.949198 5200 scope.go:117] "RemoveContainer" containerID="53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.985423 5200 scope.go:117] "RemoveContainer" containerID="e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.985667 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" path="/var/lib/kubelet/pods/b137e0aa-5092-4986-b2e4-2b275430203a/volumes" Nov 26 14:21:50 crc kubenswrapper[5200]: E1126 14:21:50.985812 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566\": container with ID starting with e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566 not found: ID does not exist" containerID="e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.985862 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566"} err="failed to get container status \"e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566\": rpc error: code = NotFound desc = could not find container \"e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566\": container with ID starting with e08778e6b5d93ce34e67ccae70a37e3f2cee310278c7495533e403f8393a0566 not found: ID does not exist" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.985895 5200 scope.go:117] "RemoveContainer" containerID="eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372" Nov 26 14:21:50 crc kubenswrapper[5200]: E1126 14:21:50.986213 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372\": container with ID starting with eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372 not found: ID does not exist" containerID="eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.986233 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372"} err="failed to get container status \"eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372\": rpc error: code = NotFound desc = could not find container \"eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372\": container with ID starting with eaa969dc5e691cfe3bfb52cf625a4568953eaa0c3d6aacaa56d742b1cde27372 not found: ID does not exist" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 
14:21:50.986256 5200 scope.go:117] "RemoveContainer" containerID="53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199" Nov 26 14:21:50 crc kubenswrapper[5200]: E1126 14:21:50.986567 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199\": container with ID starting with 53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199 not found: ID does not exist" containerID="53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199" Nov 26 14:21:50 crc kubenswrapper[5200]: I1126 14:21:50.986591 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199"} err="failed to get container status \"53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199\": rpc error: code = NotFound desc = could not find container \"53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199\": container with ID starting with 53f2cb92383c04a62ee41af15ed4f02dbdf6ec248ba7eb674e22420349aa2199 not found: ID does not exist" Nov 26 14:22:03 crc kubenswrapper[5200]: I1126 14:22:03.969970 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:22:03 crc kubenswrapper[5200]: E1126 14:22:03.971422 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:22:17 crc kubenswrapper[5200]: I1126 14:22:17.975252 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:22:17 crc kubenswrapper[5200]: E1126 14:22:17.976507 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.131776 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrk9s"] Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.133676 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="extract-utilities" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.133710 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="extract-utilities" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.133731 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="extract-content" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.133738 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="extract-content" Nov 26 14:22:23 crc 
kubenswrapper[5200]: I1126 14:22:23.133767 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="registry-server" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.133774 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="registry-server" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.133965 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b137e0aa-5092-4986-b2e4-2b275430203a" containerName="registry-server" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.305263 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrk9s"] Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.305488 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.426150 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-utilities\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.426481 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-catalog-content\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.426504 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69t2\" (UniqueName: \"kubernetes.io/projected/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-kube-api-access-q69t2\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.527783 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-utilities\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.527832 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-catalog-content\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.527876 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q69t2\" (UniqueName: \"kubernetes.io/projected/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-kube-api-access-q69t2\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.528492 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-utilities\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.528643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-catalog-content\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.553101 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69t2\" (UniqueName: \"kubernetes.io/projected/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-kube-api-access-q69t2\") pod \"redhat-operators-zrk9s\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:23 crc kubenswrapper[5200]: I1126 14:22:23.626029 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:24 crc kubenswrapper[5200]: I1126 14:22:24.102387 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrk9s"] Nov 26 14:22:24 crc kubenswrapper[5200]: I1126 14:22:24.205280 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrk9s" event={"ID":"e5f4604e-26e1-488d-9856-3bc99ddc5ec1","Type":"ContainerStarted","Data":"a1aeb6aa5517ded90bf7583320406790267315a15860eaa9628e1c4d189add55"} Nov 26 14:22:25 crc kubenswrapper[5200]: I1126 14:22:25.226045 5200 generic.go:358] "Generic (PLEG): container finished" podID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerID="346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592" exitCode=0 Nov 26 14:22:25 crc kubenswrapper[5200]: I1126 14:22:25.226120 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrk9s" event={"ID":"e5f4604e-26e1-488d-9856-3bc99ddc5ec1","Type":"ContainerDied","Data":"346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592"} Nov 26 14:22:27 crc kubenswrapper[5200]: I1126 14:22:27.251433 5200 generic.go:358] "Generic (PLEG): container finished" podID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerID="a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6" exitCode=0 Nov 26 14:22:27 crc kubenswrapper[5200]: I1126 14:22:27.251514 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrk9s" event={"ID":"e5f4604e-26e1-488d-9856-3bc99ddc5ec1","Type":"ContainerDied","Data":"a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6"} Nov 26 14:22:28 crc kubenswrapper[5200]: I1126 14:22:28.261936 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrk9s" event={"ID":"e5f4604e-26e1-488d-9856-3bc99ddc5ec1","Type":"ContainerStarted","Data":"4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39"} Nov 26 14:22:28 crc kubenswrapper[5200]: I1126 14:22:28.286589 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrk9s" podStartSLOduration=4.333520921 podStartE2EDuration="5.286561503s" podCreationTimestamp="2025-11-26 14:22:23 +0000 UTC" firstStartedPulling="2025-11-26 14:22:25.227251105 +0000 UTC m=+3113.906452923" lastFinishedPulling="2025-11-26 
14:22:26.180291687 +0000 UTC m=+3114.859493505" observedRunningTime="2025-11-26 14:22:28.276592731 +0000 UTC m=+3116.955794559" watchObservedRunningTime="2025-11-26 14:22:28.286561503 +0000 UTC m=+3116.965763331" Nov 26 14:22:31 crc kubenswrapper[5200]: I1126 14:22:31.969107 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:22:31 crc kubenswrapper[5200]: E1126 14:22:31.969958 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:22:33 crc kubenswrapper[5200]: I1126 14:22:33.626561 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:33 crc kubenswrapper[5200]: I1126 14:22:33.626628 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:33 crc kubenswrapper[5200]: I1126 14:22:33.688131 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:34 crc kubenswrapper[5200]: I1126 14:22:34.358369 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:34 crc kubenswrapper[5200]: I1126 14:22:34.411104 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrk9s"] Nov 26 14:22:36 crc kubenswrapper[5200]: I1126 14:22:36.330177 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zrk9s" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="registry-server" containerID="cri-o://4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39" gracePeriod=2 Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.326307 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.342139 5200 generic.go:358] "Generic (PLEG): container finished" podID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerID="4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39" exitCode=0 Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.342265 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrk9s" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.342267 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrk9s" event={"ID":"e5f4604e-26e1-488d-9856-3bc99ddc5ec1","Type":"ContainerDied","Data":"4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39"} Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.342355 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrk9s" event={"ID":"e5f4604e-26e1-488d-9856-3bc99ddc5ec1","Type":"ContainerDied","Data":"a1aeb6aa5517ded90bf7583320406790267315a15860eaa9628e1c4d189add55"} Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.342397 5200 scope.go:117] "RemoveContainer" containerID="4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.376275 5200 scope.go:117] "RemoveContainer" containerID="a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.397831 5200 scope.go:117] "RemoveContainer" containerID="346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.426794 5200 scope.go:117] "RemoveContainer" containerID="4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39" Nov 26 14:22:37 crc kubenswrapper[5200]: E1126 14:22:37.427422 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39\": container with ID starting with 4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39 not found: ID does not exist" containerID="4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.427476 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39"} err="failed to get container status \"4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39\": rpc error: code = NotFound desc = could not find container \"4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39\": container with ID starting with 4be88cae36a87e36448399b393b625e45ff70a4250701eb5616f37acdf1a5e39 not found: ID does not exist" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.427508 5200 scope.go:117] "RemoveContainer" containerID="a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6" Nov 26 14:22:37 crc kubenswrapper[5200]: E1126 14:22:37.428052 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6\": container with ID starting with a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6 not found: ID does not exist" containerID="a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.428119 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6"} err="failed to get container status \"a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6\": rpc error: code = NotFound desc = could not find container 
\"a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6\": container with ID starting with a276a3b3f82c73831d73848f424c3ec72495679ca7b2454a6f72d1e8d0acd4b6 not found: ID does not exist" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.428161 5200 scope.go:117] "RemoveContainer" containerID="346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592" Nov 26 14:22:37 crc kubenswrapper[5200]: E1126 14:22:37.428637 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592\": container with ID starting with 346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592 not found: ID does not exist" containerID="346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.428670 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592"} err="failed to get container status \"346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592\": rpc error: code = NotFound desc = could not find container \"346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592\": container with ID starting with 346332203c7db7f7453a1fa3d030385d8a42b6c3548e8b90628c64649ed3d592 not found: ID does not exist" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.467277 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69t2\" (UniqueName: \"kubernetes.io/projected/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-kube-api-access-q69t2\") pod \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.467366 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-utilities\") pod \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.467489 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-catalog-content\") pod \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\" (UID: \"e5f4604e-26e1-488d-9856-3bc99ddc5ec1\") " Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.468997 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-utilities" (OuterVolumeSpecName: "utilities") pod "e5f4604e-26e1-488d-9856-3bc99ddc5ec1" (UID: "e5f4604e-26e1-488d-9856-3bc99ddc5ec1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.475878 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-kube-api-access-q69t2" (OuterVolumeSpecName: "kube-api-access-q69t2") pod "e5f4604e-26e1-488d-9856-3bc99ddc5ec1" (UID: "e5f4604e-26e1-488d-9856-3bc99ddc5ec1"). InnerVolumeSpecName "kube-api-access-q69t2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.568991 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q69t2\" (UniqueName: \"kubernetes.io/projected/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-kube-api-access-q69t2\") on node \"crc\" DevicePath \"\"" Nov 26 14:22:37 crc kubenswrapper[5200]: I1126 14:22:37.569042 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:22:38 crc kubenswrapper[5200]: I1126 14:22:38.309140 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5f4604e-26e1-488d-9856-3bc99ddc5ec1" (UID: "e5f4604e-26e1-488d-9856-3bc99ddc5ec1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:22:38 crc kubenswrapper[5200]: I1126 14:22:38.382554 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5f4604e-26e1-488d-9856-3bc99ddc5ec1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:22:38 crc kubenswrapper[5200]: I1126 14:22:38.606625 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrk9s"] Nov 26 14:22:38 crc kubenswrapper[5200]: I1126 14:22:38.614562 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zrk9s"] Nov 26 14:22:38 crc kubenswrapper[5200]: I1126 14:22:38.985130 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" path="/var/lib/kubelet/pods/e5f4604e-26e1-488d-9856-3bc99ddc5ec1/volumes" Nov 26 14:22:41 crc kubenswrapper[5200]: E1126 14:22:41.866979 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167882 actualBytes=10240 Nov 26 14:22:46 crc kubenswrapper[5200]: I1126 14:22:46.969546 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:22:46 crc kubenswrapper[5200]: E1126 14:22:46.970571 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:23:01 crc kubenswrapper[5200]: I1126 14:23:01.970742 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:23:01 crc kubenswrapper[5200]: E1126 14:23:01.972331 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:23:14 crc kubenswrapper[5200]: I1126 14:23:14.969273 5200 scope.go:117] "RemoveContainer" 
containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:23:14 crc kubenswrapper[5200]: E1126 14:23:14.970370 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:23:25 crc kubenswrapper[5200]: I1126 14:23:25.971616 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:23:25 crc kubenswrapper[5200]: E1126 14:23:25.973211 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.656991 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dc785"] Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.661371 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="extract-content" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.661431 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="extract-content" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.661471 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="registry-server" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.661488 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="registry-server" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.661651 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="extract-utilities" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.661670 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="extract-utilities" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.662920 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5f4604e-26e1-488d-9856-3bc99ddc5ec1" containerName="registry-server" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.860808 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dc785"] Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.861018 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.910469 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8cb7\" (UniqueName: \"kubernetes.io/projected/784c38fa-15e5-4430-908d-39b1d74763b1-kube-api-access-g8cb7\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.910520 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-utilities\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:27 crc kubenswrapper[5200]: I1126 14:23:27.910624 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-catalog-content\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.012590 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-catalog-content\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.012806 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8cb7\" (UniqueName: \"kubernetes.io/projected/784c38fa-15e5-4430-908d-39b1d74763b1-kube-api-access-g8cb7\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.012849 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-utilities\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.013723 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-utilities\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.013784 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-catalog-content\") pod \"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.038349 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8cb7\" (UniqueName: \"kubernetes.io/projected/784c38fa-15e5-4430-908d-39b1d74763b1-kube-api-access-g8cb7\") pod 
\"redhat-marketplace-dc785\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.180865 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.633495 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dc785"] Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.640509 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:23:28 crc kubenswrapper[5200]: I1126 14:23:28.860678 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dc785" event={"ID":"784c38fa-15e5-4430-908d-39b1d74763b1","Type":"ContainerStarted","Data":"7af48ee893daf651140bd636da1a370af501c855277fcbb411284e71a8813278"} Nov 26 14:23:29 crc kubenswrapper[5200]: I1126 14:23:29.871186 5200 generic.go:358] "Generic (PLEG): container finished" podID="784c38fa-15e5-4430-908d-39b1d74763b1" containerID="a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7" exitCode=0 Nov 26 14:23:29 crc kubenswrapper[5200]: I1126 14:23:29.871297 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dc785" event={"ID":"784c38fa-15e5-4430-908d-39b1d74763b1","Type":"ContainerDied","Data":"a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7"} Nov 26 14:23:31 crc kubenswrapper[5200]: I1126 14:23:31.892640 5200 generic.go:358] "Generic (PLEG): container finished" podID="784c38fa-15e5-4430-908d-39b1d74763b1" containerID="8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a" exitCode=0 Nov 26 14:23:31 crc kubenswrapper[5200]: I1126 14:23:31.892742 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dc785" event={"ID":"784c38fa-15e5-4430-908d-39b1d74763b1","Type":"ContainerDied","Data":"8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a"} Nov 26 14:23:32 crc kubenswrapper[5200]: I1126 14:23:32.909006 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dc785" event={"ID":"784c38fa-15e5-4430-908d-39b1d74763b1","Type":"ContainerStarted","Data":"1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747"} Nov 26 14:23:32 crc kubenswrapper[5200]: I1126 14:23:32.934254 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dc785" podStartSLOduration=4.631697734 podStartE2EDuration="5.934228676s" podCreationTimestamp="2025-11-26 14:23:27 +0000 UTC" firstStartedPulling="2025-11-26 14:23:29.872962225 +0000 UTC m=+3178.552164083" lastFinishedPulling="2025-11-26 14:23:31.175493167 +0000 UTC m=+3179.854695025" observedRunningTime="2025-11-26 14:23:32.928797603 +0000 UTC m=+3181.607999461" watchObservedRunningTime="2025-11-26 14:23:32.934228676 +0000 UTC m=+3181.613430494" Nov 26 14:23:38 crc kubenswrapper[5200]: I1126 14:23:38.181446 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:38 crc kubenswrapper[5200]: I1126 14:23:38.182010 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:38 crc kubenswrapper[5200]: I1126 14:23:38.236639 5200 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:39 crc kubenswrapper[5200]: I1126 14:23:39.016343 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:39 crc kubenswrapper[5200]: I1126 14:23:39.071564 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dc785"] Nov 26 14:23:39 crc kubenswrapper[5200]: I1126 14:23:39.969852 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:23:39 crc kubenswrapper[5200]: E1126 14:23:39.970451 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:23:40 crc kubenswrapper[5200]: I1126 14:23:40.988185 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dc785" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="registry-server" containerID="cri-o://1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747" gracePeriod=2 Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.391676 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.477490 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-utilities\") pod \"784c38fa-15e5-4430-908d-39b1d74763b1\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.477631 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8cb7\" (UniqueName: \"kubernetes.io/projected/784c38fa-15e5-4430-908d-39b1d74763b1-kube-api-access-g8cb7\") pod \"784c38fa-15e5-4430-908d-39b1d74763b1\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.477761 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-catalog-content\") pod \"784c38fa-15e5-4430-908d-39b1d74763b1\" (UID: \"784c38fa-15e5-4430-908d-39b1d74763b1\") " Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.478902 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-utilities" (OuterVolumeSpecName: "utilities") pod "784c38fa-15e5-4430-908d-39b1d74763b1" (UID: "784c38fa-15e5-4430-908d-39b1d74763b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.487622 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784c38fa-15e5-4430-908d-39b1d74763b1-kube-api-access-g8cb7" (OuterVolumeSpecName: "kube-api-access-g8cb7") pod "784c38fa-15e5-4430-908d-39b1d74763b1" (UID: "784c38fa-15e5-4430-908d-39b1d74763b1"). InnerVolumeSpecName "kube-api-access-g8cb7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.491814 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "784c38fa-15e5-4430-908d-39b1d74763b1" (UID: "784c38fa-15e5-4430-908d-39b1d74763b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.580591 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.580742 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8cb7\" (UniqueName: \"kubernetes.io/projected/784c38fa-15e5-4430-908d-39b1d74763b1-kube-api-access-g8cb7\") on node \"crc\" DevicePath \"\"" Nov 26 14:23:41 crc kubenswrapper[5200]: I1126 14:23:41.580780 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784c38fa-15e5-4430-908d-39b1d74763b1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:23:41 crc kubenswrapper[5200]: E1126 14:23:41.980727 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1175800 actualBytes=10240 Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.003187 5200 generic.go:358] "Generic (PLEG): container finished" podID="784c38fa-15e5-4430-908d-39b1d74763b1" containerID="1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747" exitCode=0 Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.003344 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dc785" event={"ID":"784c38fa-15e5-4430-908d-39b1d74763b1","Type":"ContainerDied","Data":"1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747"} Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.003391 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dc785" event={"ID":"784c38fa-15e5-4430-908d-39b1d74763b1","Type":"ContainerDied","Data":"7af48ee893daf651140bd636da1a370af501c855277fcbb411284e71a8813278"} Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.003395 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dc785" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.003416 5200 scope.go:117] "RemoveContainer" containerID="1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.043871 5200 scope.go:117] "RemoveContainer" containerID="8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.061937 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dc785"] Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.070140 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dc785"] Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.084976 5200 scope.go:117] "RemoveContainer" containerID="a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.112627 5200 scope.go:117] "RemoveContainer" containerID="1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747" Nov 26 14:23:42 crc kubenswrapper[5200]: E1126 14:23:42.113329 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747\": container with ID starting with 1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747 not found: ID does not exist" containerID="1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.113395 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747"} err="failed to get container status \"1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747\": rpc error: code = NotFound desc = could not find container \"1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747\": container with ID starting with 1af627756f6f06b739ad220618bdc2092a7b21682f93ba69a4cc712d09599747 not found: ID does not exist" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.113430 5200 scope.go:117] "RemoveContainer" containerID="8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a" Nov 26 14:23:42 crc kubenswrapper[5200]: E1126 14:23:42.114218 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a\": container with ID starting with 8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a not found: ID does not exist" containerID="8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.114246 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a"} err="failed to get container status \"8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a\": rpc error: code = NotFound desc = could not find container \"8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a\": container with ID starting with 8bf5dbf3dd0228d0d37bebb7aece8c2732e83d50a21681b08c65ee344420ef4a not found: ID does not exist" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.114262 5200 scope.go:117] "RemoveContainer" 
containerID="a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7" Nov 26 14:23:42 crc kubenswrapper[5200]: E1126 14:23:42.114729 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7\": container with ID starting with a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7 not found: ID does not exist" containerID="a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.114756 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7"} err="failed to get container status \"a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7\": rpc error: code = NotFound desc = could not find container \"a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7\": container with ID starting with a7618ce25eace8f69861534270257931df8df03e8a50c32dbc9f1517b57616c7 not found: ID does not exist" Nov 26 14:23:42 crc kubenswrapper[5200]: I1126 14:23:42.987260 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" path="/var/lib/kubelet/pods/784c38fa-15e5-4430-908d-39b1d74763b1/volumes" Nov 26 14:23:53 crc kubenswrapper[5200]: I1126 14:23:53.969821 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:23:53 crc kubenswrapper[5200]: E1126 14:23:53.970715 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:24:08 crc kubenswrapper[5200]: I1126 14:24:08.969940 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:24:08 crc kubenswrapper[5200]: E1126 14:24:08.973288 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:24:21 crc kubenswrapper[5200]: I1126 14:24:21.970760 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:24:21 crc kubenswrapper[5200]: E1126 14:24:21.975049 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:24:32 crc kubenswrapper[5200]: I1126 14:24:32.981271 5200 scope.go:117] "RemoveContainer" 
containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:24:32 crc kubenswrapper[5200]: E1126 14:24:32.982611 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:24:41 crc kubenswrapper[5200]: E1126 14:24:41.895988 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167882 actualBytes=10240 Nov 26 14:24:47 crc kubenswrapper[5200]: I1126 14:24:47.971210 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:24:47 crc kubenswrapper[5200]: E1126 14:24:47.973156 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:24:58 crc kubenswrapper[5200]: I1126 14:24:58.969450 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:24:58 crc kubenswrapper[5200]: E1126 14:24:58.970679 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:25:11 crc kubenswrapper[5200]: I1126 14:25:11.969994 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:25:11 crc kubenswrapper[5200]: E1126 14:25:11.971019 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:25:26 crc kubenswrapper[5200]: I1126 14:25:26.969540 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:25:26 crc kubenswrapper[5200]: E1126 14:25:26.970278 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:25:36 crc kubenswrapper[5200]: I1126 14:25:36.086710 5200 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:25:36 crc kubenswrapper[5200]: I1126 14:25:36.093483 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:25:36 crc kubenswrapper[5200]: I1126 14:25:36.100568 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:25:36 crc kubenswrapper[5200]: I1126 14:25:36.104775 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:25:37 crc kubenswrapper[5200]: I1126 14:25:37.969911 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:25:38 crc kubenswrapper[5200]: I1126 14:25:38.538155 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"3cf6ab4f0d41bfdef4558cda6ba9cabd76930369bc160e69bc8ee2285596078c"} Nov 26 14:25:41 crc kubenswrapper[5200]: E1126 14:25:41.834636 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169206 actualBytes=10240 Nov 26 14:26:42 crc kubenswrapper[5200]: E1126 14:26:42.085285 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169206 actualBytes=10240 Nov 26 14:27:42 crc kubenswrapper[5200]: E1126 14:27:42.518267 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169205 actualBytes=10240 Nov 26 14:28:03 crc kubenswrapper[5200]: I1126 14:28:03.155658 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:28:03 crc kubenswrapper[5200]: I1126 14:28:03.157080 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.019485 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5q55d"] Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.023817 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="extract-utilities" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.024277 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="extract-utilities" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.024321 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="extract-content" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.024330 5200 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="extract-content" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.024359 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="registry-server" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.024367 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="registry-server" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.024586 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="784c38fa-15e5-4430-908d-39b1d74763b1" containerName="registry-server" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.036793 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5q55d"] Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.037065 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.142897 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-utilities\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.144516 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-catalog-content\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.144724 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cztd8\" (UniqueName: \"kubernetes.io/projected/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-kube-api-access-cztd8\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.246599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cztd8\" (UniqueName: \"kubernetes.io/projected/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-kube-api-access-cztd8\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.246849 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-utilities\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.246877 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-catalog-content\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc 
kubenswrapper[5200]: I1126 14:28:07.247529 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-catalog-content\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.248148 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-utilities\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.273491 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cztd8\" (UniqueName: \"kubernetes.io/projected/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-kube-api-access-cztd8\") pod \"community-operators-5q55d\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.408420 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:07 crc kubenswrapper[5200]: I1126 14:28:07.961060 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5q55d"] Nov 26 14:28:08 crc kubenswrapper[5200]: I1126 14:28:08.102568 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerStarted","Data":"e2fd5c4fefc179dbfed0e107a76d65550e3b68d7537954dfc9823ceba697174b"} Nov 26 14:28:09 crc kubenswrapper[5200]: I1126 14:28:09.117818 5200 generic.go:358] "Generic (PLEG): container finished" podID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerID="dbc1a6a74a8f569023d3eb8762eeedf6d0f43b18abd30832e8b46e0200289568" exitCode=0 Nov 26 14:28:09 crc kubenswrapper[5200]: I1126 14:28:09.117966 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerDied","Data":"dbc1a6a74a8f569023d3eb8762eeedf6d0f43b18abd30832e8b46e0200289568"} Nov 26 14:28:10 crc kubenswrapper[5200]: I1126 14:28:10.133233 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerStarted","Data":"adbd6a11b7629102e8e7b2a701bb6e6e010f4c4c001a96aee476d48752078863"} Nov 26 14:28:11 crc kubenswrapper[5200]: I1126 14:28:11.148940 5200 generic.go:358] "Generic (PLEG): container finished" podID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerID="adbd6a11b7629102e8e7b2a701bb6e6e010f4c4c001a96aee476d48752078863" exitCode=0 Nov 26 14:28:11 crc kubenswrapper[5200]: I1126 14:28:11.149088 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerDied","Data":"adbd6a11b7629102e8e7b2a701bb6e6e010f4c4c001a96aee476d48752078863"} Nov 26 14:28:12 crc kubenswrapper[5200]: I1126 14:28:12.165572 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" 
event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerStarted","Data":"4a4a047ef81a78add8924d415c89a31cc7c6cfacc78c9576dbc26f9c0e2cef40"} Nov 26 14:28:12 crc kubenswrapper[5200]: I1126 14:28:12.207464 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5q55d" podStartSLOduration=5.565375257 podStartE2EDuration="6.207420308s" podCreationTimestamp="2025-11-26 14:28:06 +0000 UTC" firstStartedPulling="2025-11-26 14:28:09.119539384 +0000 UTC m=+3457.798741232" lastFinishedPulling="2025-11-26 14:28:09.761584425 +0000 UTC m=+3458.440786283" observedRunningTime="2025-11-26 14:28:12.203680929 +0000 UTC m=+3460.882882817" watchObservedRunningTime="2025-11-26 14:28:12.207420308 +0000 UTC m=+3460.886622136" Nov 26 14:28:17 crc kubenswrapper[5200]: I1126 14:28:17.409242 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:17 crc kubenswrapper[5200]: I1126 14:28:17.410526 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:17 crc kubenswrapper[5200]: I1126 14:28:17.491023 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:18 crc kubenswrapper[5200]: I1126 14:28:18.288166 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:18 crc kubenswrapper[5200]: I1126 14:28:18.354255 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5q55d"] Nov 26 14:28:20 crc kubenswrapper[5200]: I1126 14:28:20.260451 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5q55d" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="registry-server" containerID="cri-o://4a4a047ef81a78add8924d415c89a31cc7c6cfacc78c9576dbc26f9c0e2cef40" gracePeriod=2 Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.271216 5200 generic.go:358] "Generic (PLEG): container finished" podID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerID="4a4a047ef81a78add8924d415c89a31cc7c6cfacc78c9576dbc26f9c0e2cef40" exitCode=0 Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.271302 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerDied","Data":"4a4a047ef81a78add8924d415c89a31cc7c6cfacc78c9576dbc26f9c0e2cef40"} Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.272874 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q55d" event={"ID":"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a","Type":"ContainerDied","Data":"e2fd5c4fefc179dbfed0e107a76d65550e3b68d7537954dfc9823ceba697174b"} Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.272898 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2fd5c4fefc179dbfed0e107a76d65550e3b68d7537954dfc9823ceba697174b" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.279832 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.353434 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-catalog-content\") pod \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.353662 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cztd8\" (UniqueName: \"kubernetes.io/projected/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-kube-api-access-cztd8\") pod \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.353803 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-utilities\") pod \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\" (UID: \"a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a\") " Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.356398 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-utilities" (OuterVolumeSpecName: "utilities") pod "a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" (UID: "a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.360932 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-kube-api-access-cztd8" (OuterVolumeSpecName: "kube-api-access-cztd8") pod "a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" (UID: "a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a"). InnerVolumeSpecName "kube-api-access-cztd8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.409659 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" (UID: "a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.457456 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.457516 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:28:21 crc kubenswrapper[5200]: I1126 14:28:21.457541 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cztd8\" (UniqueName: \"kubernetes.io/projected/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a-kube-api-access-cztd8\") on node \"crc\" DevicePath \"\"" Nov 26 14:28:22 crc kubenswrapper[5200]: I1126 14:28:22.282252 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q55d" Nov 26 14:28:22 crc kubenswrapper[5200]: I1126 14:28:22.330033 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5q55d"] Nov 26 14:28:22 crc kubenswrapper[5200]: I1126 14:28:22.335725 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5q55d"] Nov 26 14:28:22 crc kubenswrapper[5200]: I1126 14:28:22.982798 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" path="/var/lib/kubelet/pods/a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a/volumes" Nov 26 14:28:33 crc kubenswrapper[5200]: I1126 14:28:33.154852 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:28:33 crc kubenswrapper[5200]: I1126 14:28:33.155994 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:28:42 crc kubenswrapper[5200]: E1126 14:28:42.677502 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167881 actualBytes=10240 Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.154491 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.155578 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.155663 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.157024 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3cf6ab4f0d41bfdef4558cda6ba9cabd76930369bc160e69bc8ee2285596078c"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.157134 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://3cf6ab4f0d41bfdef4558cda6ba9cabd76930369bc160e69bc8ee2285596078c" gracePeriod=600 Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.327580 5200 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.771544 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="3cf6ab4f0d41bfdef4558cda6ba9cabd76930369bc160e69bc8ee2285596078c" exitCode=0 Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.771841 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"3cf6ab4f0d41bfdef4558cda6ba9cabd76930369bc160e69bc8ee2285596078c"} Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.772119 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d"} Nov 26 14:29:03 crc kubenswrapper[5200]: I1126 14:29:03.772150 5200 scope.go:117] "RemoveContainer" containerID="48d38eb7037026e8843363af1c5c3e474047235087fa335b84c15d8195b60ae0" Nov 26 14:29:42 crc kubenswrapper[5200]: E1126 14:29:42.088343 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167887 actualBytes=10240 Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.172738 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46"] Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177249 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="registry-server" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177309 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="registry-server" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177365 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="extract-content" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177385 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="extract-content" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177426 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="extract-utilities" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177446 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="extract-utilities" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.177989 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4cf90b5-f05c-40ce-b801-d2d1ebc2c62a" containerName="registry-server" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.192768 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46"] Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.193028 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.198560 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.198637 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.368514 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/165ddac3-3c99-4f74-abd2-4fe5be08ab73-config-volume\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.368807 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtxc\" (UniqueName: \"kubernetes.io/projected/165ddac3-3c99-4f74-abd2-4fe5be08ab73-kube-api-access-twtxc\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.368912 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/165ddac3-3c99-4f74-abd2-4fe5be08ab73-secret-volume\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.470780 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/165ddac3-3c99-4f74-abd2-4fe5be08ab73-config-volume\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.470992 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twtxc\" (UniqueName: \"kubernetes.io/projected/165ddac3-3c99-4f74-abd2-4fe5be08ab73-kube-api-access-twtxc\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.471061 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/165ddac3-3c99-4f74-abd2-4fe5be08ab73-secret-volume\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.473535 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/165ddac3-3c99-4f74-abd2-4fe5be08ab73-config-volume\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 
26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.495300 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/165ddac3-3c99-4f74-abd2-4fe5be08ab73-secret-volume\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.499238 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtxc\" (UniqueName: \"kubernetes.io/projected/165ddac3-3c99-4f74-abd2-4fe5be08ab73-kube-api-access-twtxc\") pod \"collect-profiles-29402790-r7h46\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.533486 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:00 crc kubenswrapper[5200]: I1126 14:30:00.853771 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46"] Nov 26 14:30:00 crc kubenswrapper[5200]: W1126 14:30:00.864801 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165ddac3_3c99_4f74_abd2_4fe5be08ab73.slice/crio-da9208abe1b1849dde728904dc97a000318b816248bec6551e83cdfc3e05f939 WatchSource:0}: Error finding container da9208abe1b1849dde728904dc97a000318b816248bec6551e83cdfc3e05f939: Status 404 returned error can't find the container with id da9208abe1b1849dde728904dc97a000318b816248bec6551e83cdfc3e05f939 Nov 26 14:30:01 crc kubenswrapper[5200]: I1126 14:30:01.387609 5200 generic.go:358] "Generic (PLEG): container finished" podID="165ddac3-3c99-4f74-abd2-4fe5be08ab73" containerID="c4084af00de440b6712c59eb8eaf4a9a0c9391ad8255782c9f3b99df82884fed" exitCode=0 Nov 26 14:30:01 crc kubenswrapper[5200]: I1126 14:30:01.387777 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" event={"ID":"165ddac3-3c99-4f74-abd2-4fe5be08ab73","Type":"ContainerDied","Data":"c4084af00de440b6712c59eb8eaf4a9a0c9391ad8255782c9f3b99df82884fed"} Nov 26 14:30:01 crc kubenswrapper[5200]: I1126 14:30:01.387833 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" event={"ID":"165ddac3-3c99-4f74-abd2-4fe5be08ab73","Type":"ContainerStarted","Data":"da9208abe1b1849dde728904dc97a000318b816248bec6551e83cdfc3e05f939"} Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.740312 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.806008 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/165ddac3-3c99-4f74-abd2-4fe5be08ab73-secret-volume\") pod \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.806078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twtxc\" (UniqueName: \"kubernetes.io/projected/165ddac3-3c99-4f74-abd2-4fe5be08ab73-kube-api-access-twtxc\") pod \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.806114 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/165ddac3-3c99-4f74-abd2-4fe5be08ab73-config-volume\") pod \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\" (UID: \"165ddac3-3c99-4f74-abd2-4fe5be08ab73\") " Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.807089 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165ddac3-3c99-4f74-abd2-4fe5be08ab73-config-volume" (OuterVolumeSpecName: "config-volume") pod "165ddac3-3c99-4f74-abd2-4fe5be08ab73" (UID: "165ddac3-3c99-4f74-abd2-4fe5be08ab73"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.812242 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165ddac3-3c99-4f74-abd2-4fe5be08ab73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "165ddac3-3c99-4f74-abd2-4fe5be08ab73" (UID: "165ddac3-3c99-4f74-abd2-4fe5be08ab73"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.812584 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165ddac3-3c99-4f74-abd2-4fe5be08ab73-kube-api-access-twtxc" (OuterVolumeSpecName: "kube-api-access-twtxc") pod "165ddac3-3c99-4f74-abd2-4fe5be08ab73" (UID: "165ddac3-3c99-4f74-abd2-4fe5be08ab73"). InnerVolumeSpecName "kube-api-access-twtxc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.907792 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twtxc\" (UniqueName: \"kubernetes.io/projected/165ddac3-3c99-4f74-abd2-4fe5be08ab73-kube-api-access-twtxc\") on node \"crc\" DevicePath \"\"" Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.907829 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/165ddac3-3c99-4f74-abd2-4fe5be08ab73-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:30:02 crc kubenswrapper[5200]: I1126 14:30:02.907838 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/165ddac3-3c99-4f74-abd2-4fe5be08ab73-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:30:03 crc kubenswrapper[5200]: I1126 14:30:03.407803 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" Nov 26 14:30:03 crc kubenswrapper[5200]: I1126 14:30:03.407786 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46" event={"ID":"165ddac3-3c99-4f74-abd2-4fe5be08ab73","Type":"ContainerDied","Data":"da9208abe1b1849dde728904dc97a000318b816248bec6551e83cdfc3e05f939"} Nov 26 14:30:03 crc kubenswrapper[5200]: I1126 14:30:03.408461 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da9208abe1b1849dde728904dc97a000318b816248bec6551e83cdfc3e05f939" Nov 26 14:30:03 crc kubenswrapper[5200]: I1126 14:30:03.844932 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg"] Nov 26 14:30:03 crc kubenswrapper[5200]: I1126 14:30:03.855476 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402745-ccbbg"] Nov 26 14:30:04 crc kubenswrapper[5200]: I1126 14:30:04.981151 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0714c26b-3adf-46af-81e6-44003392d822" path="/var/lib/kubelet/pods/0714c26b-3adf-46af-81e6-44003392d822/volumes" Nov 26 14:30:26 crc kubenswrapper[5200]: I1126 14:30:26.505778 5200 scope.go:117] "RemoveContainer" containerID="8159c589137cfaf476da2f9dd08bd77a722b9a7f11a49087e00ef5bffe4c6dc9" Nov 26 14:30:36 crc kubenswrapper[5200]: I1126 14:30:36.232602 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:30:36 crc kubenswrapper[5200]: I1126 14:30:36.240589 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:30:36 crc kubenswrapper[5200]: I1126 14:30:36.252208 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:30:36 crc kubenswrapper[5200]: I1126 14:30:36.258450 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:30:42 crc kubenswrapper[5200]: E1126 14:30:42.232080 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167887 actualBytes=10240 Nov 26 14:31:03 crc kubenswrapper[5200]: I1126 14:31:03.154002 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:31:03 crc kubenswrapper[5200]: I1126 14:31:03.155039 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:31:33 crc kubenswrapper[5200]: I1126 14:31:33.154316 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:31:33 crc kubenswrapper[5200]: I1126 14:31:33.155074 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:31:41 crc kubenswrapper[5200]: E1126 14:31:41.665774 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167886 actualBytes=10240 Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.154994 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.155796 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.155869 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.156958 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.157076 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" gracePeriod=600 Nov 26 14:32:03 crc kubenswrapper[5200]: E1126 14:32:03.290382 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.718077 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" exitCode=0 Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.718144 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d"} Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.718203 5200 scope.go:117] "RemoveContainer" containerID="3cf6ab4f0d41bfdef4558cda6ba9cabd76930369bc160e69bc8ee2285596078c" Nov 26 14:32:03 crc kubenswrapper[5200]: I1126 14:32:03.720080 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:32:03 crc kubenswrapper[5200]: E1126 14:32:03.721968 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.327229 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2mnd"] Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.331102 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="165ddac3-3c99-4f74-abd2-4fe5be08ab73" containerName="collect-profiles" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.331136 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="165ddac3-3c99-4f74-abd2-4fe5be08ab73" containerName="collect-profiles" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.331502 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="165ddac3-3c99-4f74-abd2-4fe5be08ab73" containerName="collect-profiles" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.379249 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2mnd"] Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.379434 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.449145 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-utilities\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.449243 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv6tf\" (UniqueName: \"kubernetes.io/projected/247e034c-4f27-4ac5-897a-cfe5cfe762b8-kube-api-access-nv6tf\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.449404 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-catalog-content\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.550632 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-utilities\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.550776 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv6tf\" (UniqueName: \"kubernetes.io/projected/247e034c-4f27-4ac5-897a-cfe5cfe762b8-kube-api-access-nv6tf\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.550811 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-catalog-content\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.551321 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-utilities\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.552025 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-catalog-content\") pod \"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.583266 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv6tf\" (UniqueName: \"kubernetes.io/projected/247e034c-4f27-4ac5-897a-cfe5cfe762b8-kube-api-access-nv6tf\") pod 
\"certified-operators-c2mnd\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:08 crc kubenswrapper[5200]: I1126 14:32:08.698631 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:09 crc kubenswrapper[5200]: I1126 14:32:09.159271 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2mnd"] Nov 26 14:32:09 crc kubenswrapper[5200]: I1126 14:32:09.784566 5200 generic.go:358] "Generic (PLEG): container finished" podID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerID="c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59" exitCode=0 Nov 26 14:32:09 crc kubenswrapper[5200]: I1126 14:32:09.784864 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerDied","Data":"c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59"} Nov 26 14:32:09 crc kubenswrapper[5200]: I1126 14:32:09.785508 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerStarted","Data":"397caa08775d1824526e10a6868a3db4aac5d3382f99e46aec1369213d49b26f"} Nov 26 14:32:10 crc kubenswrapper[5200]: I1126 14:32:10.796979 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerStarted","Data":"c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c"} Nov 26 14:32:11 crc kubenswrapper[5200]: I1126 14:32:11.809105 5200 generic.go:358] "Generic (PLEG): container finished" podID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerID="c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c" exitCode=0 Nov 26 14:32:11 crc kubenswrapper[5200]: I1126 14:32:11.809267 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerDied","Data":"c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c"} Nov 26 14:32:12 crc kubenswrapper[5200]: I1126 14:32:12.824251 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerStarted","Data":"1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e"} Nov 26 14:32:12 crc kubenswrapper[5200]: I1126 14:32:12.859108 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2mnd" podStartSLOduration=4.172769622 podStartE2EDuration="4.859085261s" podCreationTimestamp="2025-11-26 14:32:08 +0000 UTC" firstStartedPulling="2025-11-26 14:32:09.787092582 +0000 UTC m=+3698.466294430" lastFinishedPulling="2025-11-26 14:32:10.473408231 +0000 UTC m=+3699.152610069" observedRunningTime="2025-11-26 14:32:12.847879008 +0000 UTC m=+3701.527080856" watchObservedRunningTime="2025-11-26 14:32:12.859085261 +0000 UTC m=+3701.538287089" Nov 26 14:32:15 crc kubenswrapper[5200]: I1126 14:32:15.970123 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:32:15 crc kubenswrapper[5200]: E1126 14:32:15.971191 5200 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:32:18 crc kubenswrapper[5200]: I1126 14:32:18.699319 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:18 crc kubenswrapper[5200]: I1126 14:32:18.700623 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:18 crc kubenswrapper[5200]: I1126 14:32:18.781110 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:18 crc kubenswrapper[5200]: I1126 14:32:18.930536 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:19 crc kubenswrapper[5200]: I1126 14:32:19.030003 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2mnd"] Nov 26 14:32:20 crc kubenswrapper[5200]: I1126 14:32:20.898480 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2mnd" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="registry-server" containerID="cri-o://1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e" gracePeriod=2 Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.325066 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.388373 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-utilities\") pod \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.388503 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv6tf\" (UniqueName: \"kubernetes.io/projected/247e034c-4f27-4ac5-897a-cfe5cfe762b8-kube-api-access-nv6tf\") pod \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.388588 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-catalog-content\") pod \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\" (UID: \"247e034c-4f27-4ac5-897a-cfe5cfe762b8\") " Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.389385 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-utilities" (OuterVolumeSpecName: "utilities") pod "247e034c-4f27-4ac5-897a-cfe5cfe762b8" (UID: "247e034c-4f27-4ac5-897a-cfe5cfe762b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.398858 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247e034c-4f27-4ac5-897a-cfe5cfe762b8-kube-api-access-nv6tf" (OuterVolumeSpecName: "kube-api-access-nv6tf") pod "247e034c-4f27-4ac5-897a-cfe5cfe762b8" (UID: "247e034c-4f27-4ac5-897a-cfe5cfe762b8"). InnerVolumeSpecName "kube-api-access-nv6tf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.450640 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "247e034c-4f27-4ac5-897a-cfe5cfe762b8" (UID: "247e034c-4f27-4ac5-897a-cfe5cfe762b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.490843 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.490881 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nv6tf\" (UniqueName: \"kubernetes.io/projected/247e034c-4f27-4ac5-897a-cfe5cfe762b8-kube-api-access-nv6tf\") on node \"crc\" DevicePath \"\"" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.490894 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e034c-4f27-4ac5-897a-cfe5cfe762b8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.910749 5200 generic.go:358] "Generic (PLEG): container finished" podID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerID="1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e" exitCode=0 Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.910796 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerDied","Data":"1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e"} Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.911959 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2mnd" event={"ID":"247e034c-4f27-4ac5-897a-cfe5cfe762b8","Type":"ContainerDied","Data":"397caa08775d1824526e10a6868a3db4aac5d3382f99e46aec1369213d49b26f"} Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.910891 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2mnd" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.912003 5200 scope.go:117] "RemoveContainer" containerID="1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.946361 5200 scope.go:117] "RemoveContainer" containerID="c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c" Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.973059 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2mnd"] Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.982587 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2mnd"] Nov 26 14:32:21 crc kubenswrapper[5200]: I1126 14:32:21.995284 5200 scope.go:117] "RemoveContainer" containerID="c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.021432 5200 scope.go:117] "RemoveContainer" containerID="1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e" Nov 26 14:32:22 crc kubenswrapper[5200]: E1126 14:32:22.023133 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e\": container with ID starting with 1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e not found: ID does not exist" containerID="1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.023191 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e"} err="failed to get container status \"1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e\": rpc error: code = NotFound desc = could not find container \"1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e\": container with ID starting with 1006ee1a044175db7c31149b98d675067bbb38855fb60eaae457a474694d151e not found: ID does not exist" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.023224 5200 scope.go:117] "RemoveContainer" containerID="c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c" Nov 26 14:32:22 crc kubenswrapper[5200]: E1126 14:32:22.023667 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c\": container with ID starting with c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c not found: ID does not exist" containerID="c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.023731 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c"} err="failed to get container status \"c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c\": rpc error: code = NotFound desc = could not find container \"c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c\": container with ID starting with c0d14ee247d794a067e687b5ba51cba0f747b646601b7f7e01266ca9d7d8294c not found: ID does not exist" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.023761 5200 scope.go:117] "RemoveContainer" 
containerID="c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59" Nov 26 14:32:22 crc kubenswrapper[5200]: E1126 14:32:22.024112 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59\": container with ID starting with c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59 not found: ID does not exist" containerID="c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.024146 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59"} err="failed to get container status \"c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59\": rpc error: code = NotFound desc = could not find container \"c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59\": container with ID starting with c72580047d7f7febb304f5364561ea3c7a8983efd4c754d7ef73de60307f2a59 not found: ID does not exist" Nov 26 14:32:22 crc kubenswrapper[5200]: I1126 14:32:22.989602 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" path="/var/lib/kubelet/pods/247e034c-4f27-4ac5-897a-cfe5cfe762b8/volumes" Nov 26 14:32:28 crc kubenswrapper[5200]: I1126 14:32:28.970639 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:32:28 crc kubenswrapper[5200]: E1126 14:32:28.972096 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:32:40 crc kubenswrapper[5200]: I1126 14:32:40.970638 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:32:40 crc kubenswrapper[5200]: E1126 14:32:40.971561 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:32:42 crc kubenswrapper[5200]: E1126 14:32:42.407195 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167885 actualBytes=10240 Nov 26 14:32:51 crc kubenswrapper[5200]: I1126 14:32:51.969851 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:32:51 crc kubenswrapper[5200]: E1126 14:32:51.970899 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:33:06 crc kubenswrapper[5200]: I1126 14:33:06.970016 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:33:06 crc kubenswrapper[5200]: E1126 14:33:06.971515 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.327415 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9z29h"] Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329221 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="extract-content" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329240 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="extract-content" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329252 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="registry-server" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329259 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="registry-server" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329281 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="extract-utilities" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329288 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="extract-utilities" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.329502 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="247e034c-4f27-4ac5-897a-cfe5cfe762b8" containerName="registry-server" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.339474 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.344389 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9z29h"] Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.490565 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-utilities\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.491038 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbzz\" (UniqueName: \"kubernetes.io/projected/7c623e81-b652-4b97-b53b-26a517333ff4-kube-api-access-jfbzz\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.491099 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-catalog-content\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.593110 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbzz\" (UniqueName: \"kubernetes.io/projected/7c623e81-b652-4b97-b53b-26a517333ff4-kube-api-access-jfbzz\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.593168 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-catalog-content\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.593215 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-utilities\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.593836 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-catalog-content\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.593917 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-utilities\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.613936 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jfbzz\" (UniqueName: \"kubernetes.io/projected/7c623e81-b652-4b97-b53b-26a517333ff4-kube-api-access-jfbzz\") pod \"redhat-operators-9z29h\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:16 crc kubenswrapper[5200]: I1126 14:33:16.669658 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:17 crc kubenswrapper[5200]: I1126 14:33:17.098185 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9z29h"] Nov 26 14:33:17 crc kubenswrapper[5200]: W1126 14:33:17.100555 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c623e81_b652_4b97_b53b_26a517333ff4.slice/crio-6456a05bcb6eb7a45922e39a8c02b2dc014cc431254421af452793bc62b55fc1 WatchSource:0}: Error finding container 6456a05bcb6eb7a45922e39a8c02b2dc014cc431254421af452793bc62b55fc1: Status 404 returned error can't find the container with id 6456a05bcb6eb7a45922e39a8c02b2dc014cc431254421af452793bc62b55fc1 Nov 26 14:33:17 crc kubenswrapper[5200]: I1126 14:33:17.420881 5200 generic.go:358] "Generic (PLEG): container finished" podID="7c623e81-b652-4b97-b53b-26a517333ff4" containerID="9272be6198767e1a9be61164b34c6ae706ecbc5925f4b5bcf9f89e6048cf26d8" exitCode=0 Nov 26 14:33:17 crc kubenswrapper[5200]: I1126 14:33:17.420990 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z29h" event={"ID":"7c623e81-b652-4b97-b53b-26a517333ff4","Type":"ContainerDied","Data":"9272be6198767e1a9be61164b34c6ae706ecbc5925f4b5bcf9f89e6048cf26d8"} Nov 26 14:33:17 crc kubenswrapper[5200]: I1126 14:33:17.421052 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z29h" event={"ID":"7c623e81-b652-4b97-b53b-26a517333ff4","Type":"ContainerStarted","Data":"6456a05bcb6eb7a45922e39a8c02b2dc014cc431254421af452793bc62b55fc1"} Nov 26 14:33:19 crc kubenswrapper[5200]: I1126 14:33:19.440964 5200 generic.go:358] "Generic (PLEG): container finished" podID="7c623e81-b652-4b97-b53b-26a517333ff4" containerID="cbd2b2d7bdcfff9b00ca4b62a0da719c1a2d3906b858f42f0573434c8b3f782f" exitCode=0 Nov 26 14:33:19 crc kubenswrapper[5200]: I1126 14:33:19.441047 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z29h" event={"ID":"7c623e81-b652-4b97-b53b-26a517333ff4","Type":"ContainerDied","Data":"cbd2b2d7bdcfff9b00ca4b62a0da719c1a2d3906b858f42f0573434c8b3f782f"} Nov 26 14:33:19 crc kubenswrapper[5200]: I1126 14:33:19.970754 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:33:19 crc kubenswrapper[5200]: E1126 14:33:19.971220 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:33:20 crc kubenswrapper[5200]: I1126 14:33:20.457924 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z29h" 
event={"ID":"7c623e81-b652-4b97-b53b-26a517333ff4","Type":"ContainerStarted","Data":"f280040cc6fac8f2b769fa35f25a30c097312c8c3be8e97b4aa50e7cf4a9e23c"} Nov 26 14:33:20 crc kubenswrapper[5200]: I1126 14:33:20.506331 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9z29h" podStartSLOduration=3.5880378 podStartE2EDuration="4.506304726s" podCreationTimestamp="2025-11-26 14:33:16 +0000 UTC" firstStartedPulling="2025-11-26 14:33:17.421823179 +0000 UTC m=+3766.101024997" lastFinishedPulling="2025-11-26 14:33:18.340090065 +0000 UTC m=+3767.019291923" observedRunningTime="2025-11-26 14:33:20.488162407 +0000 UTC m=+3769.167364345" watchObservedRunningTime="2025-11-26 14:33:20.506304726 +0000 UTC m=+3769.185506574" Nov 26 14:33:26 crc kubenswrapper[5200]: I1126 14:33:26.670542 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:26 crc kubenswrapper[5200]: I1126 14:33:26.671484 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:26 crc kubenswrapper[5200]: I1126 14:33:26.730619 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.200539 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9n5t6"] Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.220996 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9n5t6"] Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.221208 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.270502 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrvv\" (UniqueName: \"kubernetes.io/projected/e6001945-9926-4403-97ea-28b3692a429a-kube-api-access-wdrvv\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.271031 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-utilities\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.271081 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-catalog-content\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.372335 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-utilities\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.372417 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-catalog-content\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.372528 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrvv\" (UniqueName: \"kubernetes.io/projected/e6001945-9926-4403-97ea-28b3692a429a-kube-api-access-wdrvv\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.373217 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-utilities\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.373237 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-catalog-content\") pod \"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.397452 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrvv\" (UniqueName: \"kubernetes.io/projected/e6001945-9926-4403-97ea-28b3692a429a-kube-api-access-wdrvv\") pod 
\"redhat-marketplace-9n5t6\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.545574 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:27 crc kubenswrapper[5200]: I1126 14:33:27.557624 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:28 crc kubenswrapper[5200]: I1126 14:33:28.003816 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9n5t6"] Nov 26 14:33:28 crc kubenswrapper[5200]: I1126 14:33:28.548899 5200 generic.go:358] "Generic (PLEG): container finished" podID="e6001945-9926-4403-97ea-28b3692a429a" containerID="e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3" exitCode=0 Nov 26 14:33:28 crc kubenswrapper[5200]: I1126 14:33:28.549371 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerDied","Data":"e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3"} Nov 26 14:33:28 crc kubenswrapper[5200]: I1126 14:33:28.549425 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerStarted","Data":"1910db1a361b75964c32222dd4975172ff83caf5bcd937c737d18398d06ae0d0"} Nov 26 14:33:29 crc kubenswrapper[5200]: I1126 14:33:29.977126 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9z29h"] Nov 26 14:33:29 crc kubenswrapper[5200]: I1126 14:33:29.977768 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9z29h" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="registry-server" containerID="cri-o://f280040cc6fac8f2b769fa35f25a30c097312c8c3be8e97b4aa50e7cf4a9e23c" gracePeriod=2 Nov 26 14:33:30 crc kubenswrapper[5200]: I1126 14:33:30.567316 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerStarted","Data":"7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509"} Nov 26 14:33:31 crc kubenswrapper[5200]: I1126 14:33:31.583919 5200 generic.go:358] "Generic (PLEG): container finished" podID="7c623e81-b652-4b97-b53b-26a517333ff4" containerID="f280040cc6fac8f2b769fa35f25a30c097312c8c3be8e97b4aa50e7cf4a9e23c" exitCode=0 Nov 26 14:33:31 crc kubenswrapper[5200]: I1126 14:33:31.583987 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z29h" event={"ID":"7c623e81-b652-4b97-b53b-26a517333ff4","Type":"ContainerDied","Data":"f280040cc6fac8f2b769fa35f25a30c097312c8c3be8e97b4aa50e7cf4a9e23c"} Nov 26 14:33:31 crc kubenswrapper[5200]: I1126 14:33:31.590541 5200 generic.go:358] "Generic (PLEG): container finished" podID="e6001945-9926-4403-97ea-28b3692a429a" containerID="7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509" exitCode=0 Nov 26 14:33:31 crc kubenswrapper[5200]: I1126 14:33:31.590679 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" 
event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerDied","Data":"7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509"} Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.319930 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.481335 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-utilities\") pod \"7c623e81-b652-4b97-b53b-26a517333ff4\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.481400 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-catalog-content\") pod \"7c623e81-b652-4b97-b53b-26a517333ff4\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.481426 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfbzz\" (UniqueName: \"kubernetes.io/projected/7c623e81-b652-4b97-b53b-26a517333ff4-kube-api-access-jfbzz\") pod \"7c623e81-b652-4b97-b53b-26a517333ff4\" (UID: \"7c623e81-b652-4b97-b53b-26a517333ff4\") " Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.483444 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-utilities" (OuterVolumeSpecName: "utilities") pod "7c623e81-b652-4b97-b53b-26a517333ff4" (UID: "7c623e81-b652-4b97-b53b-26a517333ff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.488484 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c623e81-b652-4b97-b53b-26a517333ff4-kube-api-access-jfbzz" (OuterVolumeSpecName: "kube-api-access-jfbzz") pod "7c623e81-b652-4b97-b53b-26a517333ff4" (UID: "7c623e81-b652-4b97-b53b-26a517333ff4"). InnerVolumeSpecName "kube-api-access-jfbzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.567581 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c623e81-b652-4b97-b53b-26a517333ff4" (UID: "7c623e81-b652-4b97-b53b-26a517333ff4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.583524 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.583550 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c623e81-b652-4b97-b53b-26a517333ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.583565 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jfbzz\" (UniqueName: \"kubernetes.io/projected/7c623e81-b652-4b97-b53b-26a517333ff4-kube-api-access-jfbzz\") on node \"crc\" DevicePath \"\"" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.599537 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z29h" event={"ID":"7c623e81-b652-4b97-b53b-26a517333ff4","Type":"ContainerDied","Data":"6456a05bcb6eb7a45922e39a8c02b2dc014cc431254421af452793bc62b55fc1"} Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.599589 5200 scope.go:117] "RemoveContainer" containerID="f280040cc6fac8f2b769fa35f25a30c097312c8c3be8e97b4aa50e7cf4a9e23c" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.599605 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z29h" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.603085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerStarted","Data":"975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754"} Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.623636 5200 scope.go:117] "RemoveContainer" containerID="cbd2b2d7bdcfff9b00ca4b62a0da719c1a2d3906b858f42f0573434c8b3f782f" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.627781 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9n5t6" podStartSLOduration=4.037546103 podStartE2EDuration="5.62775096s" podCreationTimestamp="2025-11-26 14:33:27 +0000 UTC" firstStartedPulling="2025-11-26 14:33:28.550573232 +0000 UTC m=+3777.229775090" lastFinishedPulling="2025-11-26 14:33:30.140778109 +0000 UTC m=+3778.819979947" observedRunningTime="2025-11-26 14:33:32.623933587 +0000 UTC m=+3781.303135415" watchObservedRunningTime="2025-11-26 14:33:32.62775096 +0000 UTC m=+3781.306952778" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.654223 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9z29h"] Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.663734 5200 scope.go:117] "RemoveContainer" containerID="9272be6198767e1a9be61164b34c6ae706ecbc5925f4b5bcf9f89e6048cf26d8" Nov 26 14:33:32 crc kubenswrapper[5200]: I1126 14:33:32.665580 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9z29h"] Nov 26 14:33:33 crc kubenswrapper[5200]: I1126 14:33:33.002827 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:33:33 crc kubenswrapper[5200]: E1126 14:33:33.003198 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:33:33 crc kubenswrapper[5200]: I1126 14:33:33.003945 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" path="/var/lib/kubelet/pods/7c623e81-b652-4b97-b53b-26a517333ff4/volumes" Nov 26 14:33:37 crc kubenswrapper[5200]: I1126 14:33:37.546132 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:37 crc kubenswrapper[5200]: I1126 14:33:37.546845 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:37 crc kubenswrapper[5200]: I1126 14:33:37.596866 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:37 crc kubenswrapper[5200]: I1126 14:33:37.697845 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:38 crc kubenswrapper[5200]: I1126 14:33:38.577033 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9n5t6"] Nov 26 14:33:39 crc kubenswrapper[5200]: I1126 14:33:39.669975 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9n5t6" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="registry-server" containerID="cri-o://975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754" gracePeriod=2 Nov 26 14:33:39 crc kubenswrapper[5200]: E1126 14:33:39.904448 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6001945_9926_4403_97ea_28b3692a429a.slice/crio-conmon-975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6001945_9926_4403_97ea_28b3692a429a.slice/crio-975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754.scope\": RecentStats: unable to find data in memory cache]" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.099588 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.213260 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdrvv\" (UniqueName: \"kubernetes.io/projected/e6001945-9926-4403-97ea-28b3692a429a-kube-api-access-wdrvv\") pod \"e6001945-9926-4403-97ea-28b3692a429a\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.213385 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-catalog-content\") pod \"e6001945-9926-4403-97ea-28b3692a429a\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.213480 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-utilities\") pod \"e6001945-9926-4403-97ea-28b3692a429a\" (UID: \"e6001945-9926-4403-97ea-28b3692a429a\") " Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.214757 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-utilities" (OuterVolumeSpecName: "utilities") pod "e6001945-9926-4403-97ea-28b3692a429a" (UID: "e6001945-9926-4403-97ea-28b3692a429a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.219965 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6001945-9926-4403-97ea-28b3692a429a-kube-api-access-wdrvv" (OuterVolumeSpecName: "kube-api-access-wdrvv") pod "e6001945-9926-4403-97ea-28b3692a429a" (UID: "e6001945-9926-4403-97ea-28b3692a429a"). InnerVolumeSpecName "kube-api-access-wdrvv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.223563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6001945-9926-4403-97ea-28b3692a429a" (UID: "e6001945-9926-4403-97ea-28b3692a429a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.315199 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.315278 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6001945-9926-4403-97ea-28b3692a429a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.315298 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wdrvv\" (UniqueName: \"kubernetes.io/projected/e6001945-9926-4403-97ea-28b3692a429a-kube-api-access-wdrvv\") on node \"crc\" DevicePath \"\"" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.681853 5200 generic.go:358] "Generic (PLEG): container finished" podID="e6001945-9926-4403-97ea-28b3692a429a" containerID="975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754" exitCode=0 Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.682026 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9n5t6" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.682029 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerDied","Data":"975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754"} Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.682227 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9n5t6" event={"ID":"e6001945-9926-4403-97ea-28b3692a429a","Type":"ContainerDied","Data":"1910db1a361b75964c32222dd4975172ff83caf5bcd937c737d18398d06ae0d0"} Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.682261 5200 scope.go:117] "RemoveContainer" containerID="975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.706362 5200 scope.go:117] "RemoveContainer" containerID="7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.731931 5200 scope.go:117] "RemoveContainer" containerID="e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.749371 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9n5t6"] Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.756336 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9n5t6"] Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.785906 5200 scope.go:117] "RemoveContainer" containerID="975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754" Nov 26 14:33:40 crc kubenswrapper[5200]: E1126 14:33:40.786454 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754\": container with ID starting with 975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754 not found: ID does not exist" containerID="975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.786527 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754"} err="failed to get container status \"975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754\": rpc error: code = NotFound desc = could not find container \"975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754\": container with ID starting with 975b055112a76d58e96dcb3e4e7628967e81cfbf26beeb67635e6f8da6464754 not found: ID does not exist" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.786562 5200 scope.go:117] "RemoveContainer" containerID="7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509" Nov 26 14:33:40 crc kubenswrapper[5200]: E1126 14:33:40.786981 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509\": container with ID starting with 7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509 not found: ID does not exist" containerID="7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.787052 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509"} err="failed to get container status \"7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509\": rpc error: code = NotFound desc = could not find container \"7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509\": container with ID starting with 7790842f5cf9ace6dc47c4dc196e0d9942b68da0926b92134eb7b68b4b36d509 not found: ID does not exist" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.787098 5200 scope.go:117] "RemoveContainer" containerID="e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3" Nov 26 14:33:40 crc kubenswrapper[5200]: E1126 14:33:40.787401 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3\": container with ID starting with e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3 not found: ID does not exist" containerID="e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.787437 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3"} err="failed to get container status \"e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3\": rpc error: code = NotFound desc = could not find container \"e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3\": container with ID starting with e2bd35937ab258258e0a92638bcbfcb0e8f91e10f797abac661767955f6dd8a3 not found: ID does not exist" Nov 26 14:33:40 crc kubenswrapper[5200]: I1126 14:33:40.981421 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6001945-9926-4403-97ea-28b3692a429a" path="/var/lib/kubelet/pods/e6001945-9926-4403-97ea-28b3692a429a/volumes" Nov 26 14:33:42 crc kubenswrapper[5200]: E1126 14:33:42.002287 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169208 actualBytes=10240 Nov 26 14:33:46 crc kubenswrapper[5200]: I1126 14:33:46.969422 5200 scope.go:117] "RemoveContainer" 
containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:33:46 crc kubenswrapper[5200]: E1126 14:33:46.970528 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:33:58 crc kubenswrapper[5200]: I1126 14:33:58.970059 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:33:58 crc kubenswrapper[5200]: E1126 14:33:58.971004 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:34:10 crc kubenswrapper[5200]: I1126 14:34:10.970220 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:34:10 crc kubenswrapper[5200]: E1126 14:34:10.972063 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:34:25 crc kubenswrapper[5200]: I1126 14:34:25.970568 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:34:25 crc kubenswrapper[5200]: E1126 14:34:25.971775 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:34:26 crc kubenswrapper[5200]: I1126 14:34:26.647465 5200 scope.go:117] "RemoveContainer" containerID="dbc1a6a74a8f569023d3eb8762eeedf6d0f43b18abd30832e8b46e0200289568" Nov 26 14:34:26 crc kubenswrapper[5200]: I1126 14:34:26.677971 5200 scope.go:117] "RemoveContainer" containerID="4a4a047ef81a78add8924d415c89a31cc7c6cfacc78c9576dbc26f9c0e2cef40" Nov 26 14:34:26 crc kubenswrapper[5200]: I1126 14:34:26.712055 5200 scope.go:117] "RemoveContainer" containerID="adbd6a11b7629102e8e7b2a701bb6e6e010f4c4c001a96aee476d48752078863" Nov 26 14:34:36 crc kubenswrapper[5200]: I1126 14:34:36.969927 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:34:36 crc kubenswrapper[5200]: E1126 14:34:36.970753 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:34:42 crc kubenswrapper[5200]: E1126 14:34:42.006267 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169211 actualBytes=10240 Nov 26 14:34:48 crc kubenswrapper[5200]: I1126 14:34:48.969436 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:34:48 crc kubenswrapper[5200]: E1126 14:34:48.970284 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:35:00 crc kubenswrapper[5200]: I1126 14:35:00.971379 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:35:00 crc kubenswrapper[5200]: E1126 14:35:00.972897 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:35:15 crc kubenswrapper[5200]: I1126 14:35:15.970276 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:35:15 crc kubenswrapper[5200]: E1126 14:35:15.972108 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:35:29 crc kubenswrapper[5200]: I1126 14:35:29.969749 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:35:29 crc kubenswrapper[5200]: E1126 14:35:29.970575 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:35:36 crc kubenswrapper[5200]: I1126 14:35:36.365201 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:35:36 crc kubenswrapper[5200]: I1126 14:35:36.372120 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:35:36 crc kubenswrapper[5200]: I1126 14:35:36.392405 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:35:36 crc kubenswrapper[5200]: I1126 14:35:36.396852 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:35:42 crc kubenswrapper[5200]: E1126 14:35:42.786440 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169211 actualBytes=10240 Nov 26 14:35:43 crc kubenswrapper[5200]: I1126 14:35:43.971819 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:35:43 crc kubenswrapper[5200]: E1126 14:35:43.972308 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:35:58 crc kubenswrapper[5200]: I1126 14:35:58.969348 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:35:58 crc kubenswrapper[5200]: E1126 14:35:58.970243 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:36:12 crc kubenswrapper[5200]: I1126 14:36:12.985280 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:36:12 crc kubenswrapper[5200]: E1126 14:36:12.986507 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:36:23 crc kubenswrapper[5200]: I1126 14:36:23.969214 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:36:23 crc kubenswrapper[5200]: E1126 14:36:23.969949 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:36:35 crc kubenswrapper[5200]: I1126 14:36:35.970367 
5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:36:35 crc kubenswrapper[5200]: E1126 14:36:35.972099 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:36:42 crc kubenswrapper[5200]: E1126 14:36:42.190573 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167891 actualBytes=10240 Nov 26 14:36:50 crc kubenswrapper[5200]: I1126 14:36:50.977609 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:36:50 crc kubenswrapper[5200]: E1126 14:36:50.978445 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:37:03 crc kubenswrapper[5200]: I1126 14:37:03.969372 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:37:03 crc kubenswrapper[5200]: I1126 14:37:03.973879 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:37:04 crc kubenswrapper[5200]: I1126 14:37:04.586855 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"6a9a5a8654f2abb1ded2f0b30d9c887200e9ba309a09164d1c2f9405d3a8a84d"} Nov 26 14:37:42 crc kubenswrapper[5200]: E1126 14:37:42.103448 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167889 actualBytes=10240 Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.368350 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lf9mh"] Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370810 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="registry-server" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370837 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="registry-server" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370857 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="extract-content" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370869 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="extract-content" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370885 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="registry-server" 
Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370897 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="registry-server" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370940 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="extract-utilities" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.370951 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="extract-utilities" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.371004 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="extract-content" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.371016 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="extract-content" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.371038 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="extract-utilities" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.371049 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="extract-utilities" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.371295 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6001945-9926-4403-97ea-28b3692a429a" containerName="registry-server" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.371318 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c623e81-b652-4b97-b53b-26a517333ff4" containerName="registry-server" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.380136 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.392143 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lf9mh"] Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.452207 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6fpt\" (UniqueName: \"kubernetes.io/projected/d0e653c0-a817-496a-8646-9f5caf72a212-kube-api-access-r6fpt\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.452625 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-utilities\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.452808 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-catalog-content\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.554606 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-utilities\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.554747 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-catalog-content\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.554797 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6fpt\" (UniqueName: \"kubernetes.io/projected/d0e653c0-a817-496a-8646-9f5caf72a212-kube-api-access-r6fpt\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.556112 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-catalog-content\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.556098 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-utilities\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.587893 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r6fpt\" (UniqueName: \"kubernetes.io/projected/d0e653c0-a817-496a-8646-9f5caf72a212-kube-api-access-r6fpt\") pod \"community-operators-lf9mh\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:31 crc kubenswrapper[5200]: I1126 14:38:31.712875 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:32 crc kubenswrapper[5200]: I1126 14:38:32.251053 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lf9mh"] Nov 26 14:38:32 crc kubenswrapper[5200]: I1126 14:38:32.406916 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lf9mh" event={"ID":"d0e653c0-a817-496a-8646-9f5caf72a212","Type":"ContainerStarted","Data":"82b7065fa31e9acdf2b529ff997d065bb1990c982b490f8c3a43d989c921df78"} Nov 26 14:38:33 crc kubenswrapper[5200]: I1126 14:38:33.415115 5200 generic.go:358] "Generic (PLEG): container finished" podID="d0e653c0-a817-496a-8646-9f5caf72a212" containerID="c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0" exitCode=0 Nov 26 14:38:33 crc kubenswrapper[5200]: I1126 14:38:33.415202 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lf9mh" event={"ID":"d0e653c0-a817-496a-8646-9f5caf72a212","Type":"ContainerDied","Data":"c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0"} Nov 26 14:38:34 crc kubenswrapper[5200]: I1126 14:38:34.428910 5200 generic.go:358] "Generic (PLEG): container finished" podID="d0e653c0-a817-496a-8646-9f5caf72a212" containerID="0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234" exitCode=0 Nov 26 14:38:34 crc kubenswrapper[5200]: I1126 14:38:34.429072 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lf9mh" event={"ID":"d0e653c0-a817-496a-8646-9f5caf72a212","Type":"ContainerDied","Data":"0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234"} Nov 26 14:38:35 crc kubenswrapper[5200]: I1126 14:38:35.442130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lf9mh" event={"ID":"d0e653c0-a817-496a-8646-9f5caf72a212","Type":"ContainerStarted","Data":"398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349"} Nov 26 14:38:35 crc kubenswrapper[5200]: I1126 14:38:35.467326 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lf9mh" podStartSLOduration=3.868971106 podStartE2EDuration="4.467296153s" podCreationTimestamp="2025-11-26 14:38:31 +0000 UTC" firstStartedPulling="2025-11-26 14:38:33.416260634 +0000 UTC m=+4082.095462452" lastFinishedPulling="2025-11-26 14:38:34.014585681 +0000 UTC m=+4082.693787499" observedRunningTime="2025-11-26 14:38:35.461637203 +0000 UTC m=+4084.140839031" watchObservedRunningTime="2025-11-26 14:38:35.467296153 +0000 UTC m=+4084.146498011" Nov 26 14:38:41 crc kubenswrapper[5200]: I1126 14:38:41.713159 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:41 crc kubenswrapper[5200]: I1126 14:38:41.714051 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:41 crc kubenswrapper[5200]: I1126 
14:38:41.755892 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:41 crc kubenswrapper[5200]: E1126 14:38:41.885245 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1175927 actualBytes=10240 Nov 26 14:38:42 crc kubenswrapper[5200]: I1126 14:38:42.547294 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:42 crc kubenswrapper[5200]: I1126 14:38:42.608456 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lf9mh"] Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.523381 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lf9mh" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="registry-server" containerID="cri-o://398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349" gracePeriod=2 Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.952419 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.987591 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-utilities\") pod \"d0e653c0-a817-496a-8646-9f5caf72a212\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.987810 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-catalog-content\") pod \"d0e653c0-a817-496a-8646-9f5caf72a212\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.987875 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6fpt\" (UniqueName: \"kubernetes.io/projected/d0e653c0-a817-496a-8646-9f5caf72a212-kube-api-access-r6fpt\") pod \"d0e653c0-a817-496a-8646-9f5caf72a212\" (UID: \"d0e653c0-a817-496a-8646-9f5caf72a212\") " Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.989254 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-utilities" (OuterVolumeSpecName: "utilities") pod "d0e653c0-a817-496a-8646-9f5caf72a212" (UID: "d0e653c0-a817-496a-8646-9f5caf72a212"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:38:44 crc kubenswrapper[5200]: I1126 14:38:44.995395 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e653c0-a817-496a-8646-9f5caf72a212-kube-api-access-r6fpt" (OuterVolumeSpecName: "kube-api-access-r6fpt") pod "d0e653c0-a817-496a-8646-9f5caf72a212" (UID: "d0e653c0-a817-496a-8646-9f5caf72a212"). InnerVolumeSpecName "kube-api-access-r6fpt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.037358 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0e653c0-a817-496a-8646-9f5caf72a212" (UID: "d0e653c0-a817-496a-8646-9f5caf72a212"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.090090 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.090125 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e653c0-a817-496a-8646-9f5caf72a212-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.090138 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6fpt\" (UniqueName: \"kubernetes.io/projected/d0e653c0-a817-496a-8646-9f5caf72a212-kube-api-access-r6fpt\") on node \"crc\" DevicePath \"\"" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.532505 5200 generic.go:358] "Generic (PLEG): container finished" podID="d0e653c0-a817-496a-8646-9f5caf72a212" containerID="398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349" exitCode=0 Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.532559 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lf9mh" event={"ID":"d0e653c0-a817-496a-8646-9f5caf72a212","Type":"ContainerDied","Data":"398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349"} Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.532657 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lf9mh" event={"ID":"d0e653c0-a817-496a-8646-9f5caf72a212","Type":"ContainerDied","Data":"82b7065fa31e9acdf2b529ff997d065bb1990c982b490f8c3a43d989c921df78"} Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.532599 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lf9mh" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.532711 5200 scope.go:117] "RemoveContainer" containerID="398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.554595 5200 scope.go:117] "RemoveContainer" containerID="0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.576409 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lf9mh"] Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.580146 5200 scope.go:117] "RemoveContainer" containerID="c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.585268 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lf9mh"] Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.616319 5200 scope.go:117] "RemoveContainer" containerID="398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349" Nov 26 14:38:45 crc kubenswrapper[5200]: E1126 14:38:45.616782 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349\": container with ID starting with 398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349 not found: ID does not exist" containerID="398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.616828 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349"} err="failed to get container status \"398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349\": rpc error: code = NotFound desc = could not find container \"398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349\": container with ID starting with 398e7b615e4253dd06273299fb09b5abc8da68020c4f67c5a5174a3fbac9c349 not found: ID does not exist" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.616855 5200 scope.go:117] "RemoveContainer" containerID="0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234" Nov 26 14:38:45 crc kubenswrapper[5200]: E1126 14:38:45.617204 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234\": container with ID starting with 0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234 not found: ID does not exist" containerID="0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.617250 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234"} err="failed to get container status \"0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234\": rpc error: code = NotFound desc = could not find container \"0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234\": container with ID starting with 0831c8c3ce42e6d54730715c0b8ddd8a7a30165e122976d4ec599b7f5ae99234 not found: ID does not exist" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.617282 5200 scope.go:117] "RemoveContainer" 
containerID="c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0" Nov 26 14:38:45 crc kubenswrapper[5200]: E1126 14:38:45.617507 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0\": container with ID starting with c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0 not found: ID does not exist" containerID="c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0" Nov 26 14:38:45 crc kubenswrapper[5200]: I1126 14:38:45.617533 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0"} err="failed to get container status \"c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0\": rpc error: code = NotFound desc = could not find container \"c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0\": container with ID starting with c78cd3aee68de67d272250cc11899750fc86214a74cea4e5e7083504d6b1a1d0 not found: ID does not exist" Nov 26 14:38:46 crc kubenswrapper[5200]: I1126 14:38:46.983034 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" path="/var/lib/kubelet/pods/d0e653c0-a817-496a-8646-9f5caf72a212/volumes" Nov 26 14:39:33 crc kubenswrapper[5200]: I1126 14:39:33.154339 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:39:33 crc kubenswrapper[5200]: I1126 14:39:33.155133 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:39:41 crc kubenswrapper[5200]: E1126 14:39:41.912272 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167886 actualBytes=10240 Nov 26 14:40:03 crc kubenswrapper[5200]: I1126 14:40:03.153927 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:40:03 crc kubenswrapper[5200]: I1126 14:40:03.154512 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.154533 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.155415 5200 prober.go:120] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.155499 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.156630 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a9a5a8654f2abb1ded2f0b30d9c887200e9ba309a09164d1c2f9405d3a8a84d"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.156768 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://6a9a5a8654f2abb1ded2f0b30d9c887200e9ba309a09164d1c2f9405d3a8a84d" gracePeriod=600 Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.613149 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="6a9a5a8654f2abb1ded2f0b30d9c887200e9ba309a09164d1c2f9405d3a8a84d" exitCode=0 Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.613257 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"6a9a5a8654f2abb1ded2f0b30d9c887200e9ba309a09164d1c2f9405d3a8a84d"} Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.613820 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa"} Nov 26 14:40:33 crc kubenswrapper[5200]: I1126 14:40:33.613844 5200 scope.go:117] "RemoveContainer" containerID="b56af3035cc43314ef1de87708d13ba761df802b987259a14cfff2fd42907a6d" Nov 26 14:40:36 crc kubenswrapper[5200]: I1126 14:40:36.513744 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:40:36 crc kubenswrapper[5200]: I1126 14:40:36.522682 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:40:36 crc kubenswrapper[5200]: I1126 14:40:36.541558 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:40:36 crc kubenswrapper[5200]: I1126 14:40:36.549838 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:40:42 crc kubenswrapper[5200]: E1126 14:40:42.040197 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167891 actualBytes=10240 Nov 26 14:41:42 
crc kubenswrapper[5200]: E1126 14:41:42.063152 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169215 actualBytes=10240 Nov 26 14:42:33 crc kubenswrapper[5200]: I1126 14:42:33.154588 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:42:33 crc kubenswrapper[5200]: I1126 14:42:33.155337 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:42:42 crc kubenswrapper[5200]: E1126 14:42:42.240496 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1169215 actualBytes=10240 Nov 26 14:43:03 crc kubenswrapper[5200]: I1126 14:43:03.154118 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:43:03 crc kubenswrapper[5200]: I1126 14:43:03.154904 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.155225 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.156432 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.156534 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.157838 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.157959 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" 
containerID="cri-o://7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" gracePeriod=600 Nov 26 14:43:33 crc kubenswrapper[5200]: E1126 14:43:33.306966 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.347999 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" exitCode=0 Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.348085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa"} Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.348176 5200 scope.go:117] "RemoveContainer" containerID="6a9a5a8654f2abb1ded2f0b30d9c887200e9ba309a09164d1c2f9405d3a8a84d" Nov 26 14:43:33 crc kubenswrapper[5200]: I1126 14:43:33.348779 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:43:33 crc kubenswrapper[5200]: E1126 14:43:33.349271 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.690918 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qkf4t"] Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.693520 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="extract-utilities" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.693574 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="extract-utilities" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.693607 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="extract-content" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.693626 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="extract-content" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.693755 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="registry-server" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.693777 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="registry-server" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.694308 5200 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="d0e653c0-a817-496a-8646-9f5caf72a212" containerName="registry-server" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.709029 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qkf4t"] Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.709276 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.751300 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvj7\" (UniqueName: \"kubernetes.io/projected/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-kube-api-access-8nvj7\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.751357 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-catalog-content\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.751415 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-utilities\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.852312 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvj7\" (UniqueName: \"kubernetes.io/projected/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-kube-api-access-8nvj7\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.852655 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-catalog-content\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.852755 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-utilities\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.853292 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-catalog-content\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.853368 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-utilities\") pod \"redhat-operators-qkf4t\" (UID: 
\"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:36 crc kubenswrapper[5200]: I1126 14:43:36.874637 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nvj7\" (UniqueName: \"kubernetes.io/projected/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-kube-api-access-8nvj7\") pod \"redhat-operators-qkf4t\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:37 crc kubenswrapper[5200]: I1126 14:43:37.034501 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:37 crc kubenswrapper[5200]: I1126 14:43:37.455653 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qkf4t"] Nov 26 14:43:37 crc kubenswrapper[5200]: I1126 14:43:37.465329 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:43:38 crc kubenswrapper[5200]: I1126 14:43:38.413056 5200 generic.go:358] "Generic (PLEG): container finished" podID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerID="d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344" exitCode=0 Nov 26 14:43:38 crc kubenswrapper[5200]: I1126 14:43:38.413750 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkf4t" event={"ID":"c92d4fa2-965e-4d78-9e84-66baa6bba5c5","Type":"ContainerDied","Data":"d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344"} Nov 26 14:43:38 crc kubenswrapper[5200]: I1126 14:43:38.413789 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkf4t" event={"ID":"c92d4fa2-965e-4d78-9e84-66baa6bba5c5","Type":"ContainerStarted","Data":"cc2596cd5e74e5b811ec3b422c98e3fd88ab3458b2888ef4fc453978280edcf2"} Nov 26 14:43:40 crc kubenswrapper[5200]: I1126 14:43:40.443114 5200 generic.go:358] "Generic (PLEG): container finished" podID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerID="8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397" exitCode=0 Nov 26 14:43:40 crc kubenswrapper[5200]: I1126 14:43:40.443249 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkf4t" event={"ID":"c92d4fa2-965e-4d78-9e84-66baa6bba5c5","Type":"ContainerDied","Data":"8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397"} Nov 26 14:43:41 crc kubenswrapper[5200]: I1126 14:43:41.456917 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkf4t" event={"ID":"c92d4fa2-965e-4d78-9e84-66baa6bba5c5","Type":"ContainerStarted","Data":"c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019"} Nov 26 14:43:41 crc kubenswrapper[5200]: I1126 14:43:41.482538 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qkf4t" podStartSLOduration=4.615796701 podStartE2EDuration="5.482515165s" podCreationTimestamp="2025-11-26 14:43:36 +0000 UTC" firstStartedPulling="2025-11-26 14:43:38.415189128 +0000 UTC m=+4387.094390946" lastFinishedPulling="2025-11-26 14:43:39.281907592 +0000 UTC m=+4387.961109410" observedRunningTime="2025-11-26 14:43:41.476367703 +0000 UTC m=+4390.155569591" watchObservedRunningTime="2025-11-26 14:43:41.482515165 +0000 UTC m=+4390.161716983" Nov 26 14:43:42 crc kubenswrapper[5200]: E1126 14:43:42.138494 5200 prober.go:256] "Unable to write all bytes from 
execInContainer" err="short write" expectedBytes=1177241 actualBytes=10240 Nov 26 14:43:47 crc kubenswrapper[5200]: I1126 14:43:47.035148 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:47 crc kubenswrapper[5200]: I1126 14:43:47.035568 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:47 crc kubenswrapper[5200]: I1126 14:43:47.097215 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:47 crc kubenswrapper[5200]: I1126 14:43:47.561556 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:47 crc kubenswrapper[5200]: I1126 14:43:47.609751 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qkf4t"] Nov 26 14:43:48 crc kubenswrapper[5200]: I1126 14:43:48.977846 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:43:48 crc kubenswrapper[5200]: E1126 14:43:48.979327 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:43:49 crc kubenswrapper[5200]: I1126 14:43:49.517498 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qkf4t" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="registry-server" containerID="cri-o://c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019" gracePeriod=2 Nov 26 14:43:49 crc kubenswrapper[5200]: I1126 14:43:49.910129 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.082895 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-catalog-content\") pod \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.083111 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nvj7\" (UniqueName: \"kubernetes.io/projected/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-kube-api-access-8nvj7\") pod \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.083171 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-utilities\") pod \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\" (UID: \"c92d4fa2-965e-4d78-9e84-66baa6bba5c5\") " Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.086836 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-utilities" (OuterVolumeSpecName: "utilities") pod "c92d4fa2-965e-4d78-9e84-66baa6bba5c5" (UID: "c92d4fa2-965e-4d78-9e84-66baa6bba5c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.097052 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-kube-api-access-8nvj7" (OuterVolumeSpecName: "kube-api-access-8nvj7") pod "c92d4fa2-965e-4d78-9e84-66baa6bba5c5" (UID: "c92d4fa2-965e-4d78-9e84-66baa6bba5c5"). InnerVolumeSpecName "kube-api-access-8nvj7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.185054 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nvj7\" (UniqueName: \"kubernetes.io/projected/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-kube-api-access-8nvj7\") on node \"crc\" DevicePath \"\"" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.185087 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.531577 5200 generic.go:358] "Generic (PLEG): container finished" podID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerID="c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019" exitCode=0 Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.531735 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkf4t" event={"ID":"c92d4fa2-965e-4d78-9e84-66baa6bba5c5","Type":"ContainerDied","Data":"c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019"} Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.531770 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkf4t" event={"ID":"c92d4fa2-965e-4d78-9e84-66baa6bba5c5","Type":"ContainerDied","Data":"cc2596cd5e74e5b811ec3b422c98e3fd88ab3458b2888ef4fc453978280edcf2"} Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.531785 5200 scope.go:117] "RemoveContainer" containerID="c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.531955 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qkf4t" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.563107 5200 scope.go:117] "RemoveContainer" containerID="8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.588736 5200 scope.go:117] "RemoveContainer" containerID="d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.613777 5200 scope.go:117] "RemoveContainer" containerID="c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019" Nov 26 14:43:50 crc kubenswrapper[5200]: E1126 14:43:50.614470 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019\": container with ID starting with c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019 not found: ID does not exist" containerID="c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.614547 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019"} err="failed to get container status \"c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019\": rpc error: code = NotFound desc = could not find container \"c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019\": container with ID starting with c77569084f759d66d47fded5944ca1ca31035ba1b9daae1b06421c4b758b6019 not found: ID does not exist" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.614597 5200 scope.go:117] "RemoveContainer" containerID="8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397" Nov 26 14:43:50 crc kubenswrapper[5200]: E1126 14:43:50.615078 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397\": container with ID starting with 8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397 not found: ID does not exist" containerID="8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.615188 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397"} err="failed to get container status \"8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397\": rpc error: code = NotFound desc = could not find container \"8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397\": container with ID starting with 8b9076e05e467c8f49d092674496a37fc853ba3c54a972aa1f497c8134625397 not found: ID does not exist" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.615217 5200 scope.go:117] "RemoveContainer" containerID="d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344" Nov 26 14:43:50 crc kubenswrapper[5200]: E1126 14:43:50.615596 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344\": container with ID starting with d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344 not found: ID does not exist" containerID="d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344" 
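Note on the "ContainerStatus from runtime service failed ... NotFound" errors above: they appear to be a harmless cleanup race. The kubelet has just issued RemoveContainer for these IDs, then asks the runtime for their status while tidying its own bookkeeping, and CRI-O answers with a gRPC NotFound. A minimal Go sketch of how such a reply can be classified as "already gone" rather than treated as a real failure (this is an illustration only, not the kubelet's code; the helper name and the truncated container ID are made up):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// isAlreadyGone reports whether an error from a CRI call means the container
// no longer exists, so the deletion can be treated as already complete.
// (Illustrative helper; not part of the kubelet.)
func isAlreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Simulate the runtime reply seen in the log: rpc error, code = NotFound.
	err := status.Error(codes.NotFound, `could not find container "398e7b61..."`)

	if isAlreadyGone(err) {
		fmt.Println("container already removed; nothing left to delete")
	} else {
		fmt.Println("unexpected runtime error:", err)
	}
}
```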
Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.615650 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344"} err="failed to get container status \"d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344\": rpc error: code = NotFound desc = could not find container \"d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344\": container with ID starting with d5f5847aa2edd057a8f9fe7d710f0005c94236df958e5e32fcd7b89baed78344 not found: ID does not exist" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.940457 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c92d4fa2-965e-4d78-9e84-66baa6bba5c5" (UID: "c92d4fa2-965e-4d78-9e84-66baa6bba5c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:43:50 crc kubenswrapper[5200]: I1126 14:43:50.997829 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92d4fa2-965e-4d78-9e84-66baa6bba5c5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:43:51 crc kubenswrapper[5200]: I1126 14:43:51.159994 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qkf4t"] Nov 26 14:43:51 crc kubenswrapper[5200]: I1126 14:43:51.164642 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qkf4t"] Nov 26 14:43:52 crc kubenswrapper[5200]: I1126 14:43:52.985275 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" path="/var/lib/kubelet/pods/c92d4fa2-965e-4d78-9e84-66baa6bba5c5/volumes" Nov 26 14:44:00 crc kubenswrapper[5200]: I1126 14:44:00.970939 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:44:00 crc kubenswrapper[5200]: E1126 14:44:00.971767 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.020039 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rktm5"] Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.022922 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="registry-server" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.023048 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="registry-server" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.023194 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="extract-content" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.023282 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" 
containerName="extract-content" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.023368 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="extract-utilities" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.023804 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="extract-utilities" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.024091 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c92d4fa2-965e-4d78-9e84-66baa6bba5c5" containerName="registry-server" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.084495 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rktm5"] Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.084803 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.221043 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-utilities\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.221098 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-catalog-content\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.221118 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgmt\" (UniqueName: \"kubernetes.io/projected/28486335-5ef5-47e1-98f9-ce7f9666a506-kube-api-access-2tgmt\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.322213 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-catalog-content\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.322257 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgmt\" (UniqueName: \"kubernetes.io/projected/28486335-5ef5-47e1-98f9-ce7f9666a506-kube-api-access-2tgmt\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.322365 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-utilities\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.322835 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-catalog-content\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.322904 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-utilities\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.344229 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgmt\" (UniqueName: \"kubernetes.io/projected/28486335-5ef5-47e1-98f9-ce7f9666a506-kube-api-access-2tgmt\") pod \"redhat-marketplace-rktm5\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.405847 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:04 crc kubenswrapper[5200]: I1126 14:44:04.815910 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rktm5"] Nov 26 14:44:05 crc kubenswrapper[5200]: I1126 14:44:05.651666 5200 generic.go:358] "Generic (PLEG): container finished" podID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerID="023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d" exitCode=0 Nov 26 14:44:05 crc kubenswrapper[5200]: I1126 14:44:05.651757 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rktm5" event={"ID":"28486335-5ef5-47e1-98f9-ce7f9666a506","Type":"ContainerDied","Data":"023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d"} Nov 26 14:44:05 crc kubenswrapper[5200]: I1126 14:44:05.652145 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rktm5" event={"ID":"28486335-5ef5-47e1-98f9-ce7f9666a506","Type":"ContainerStarted","Data":"916f8e5def77a9eb7d763b62ae1d44ced64a8d876b5783dfa78fb718d9f0542c"} Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.396141 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dlf27"] Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.461752 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlf27"] Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.461954 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.653887 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-catalog-content\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.653962 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-utilities\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.654002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7vd\" (UniqueName: \"kubernetes.io/projected/a898e136-b2a6-4bbe-ab91-7e0070e79e41-kube-api-access-bd7vd\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.755738 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-catalog-content\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.756127 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-utilities\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.756176 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7vd\" (UniqueName: \"kubernetes.io/projected/a898e136-b2a6-4bbe-ab91-7e0070e79e41-kube-api-access-bd7vd\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.756460 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-catalog-content\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.756620 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-utilities\") pod \"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.778672 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7vd\" (UniqueName: \"kubernetes.io/projected/a898e136-b2a6-4bbe-ab91-7e0070e79e41-kube-api-access-bd7vd\") pod 
\"certified-operators-dlf27\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:06 crc kubenswrapper[5200]: I1126 14:44:06.786285 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:07 crc kubenswrapper[5200]: I1126 14:44:07.146080 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlf27"] Nov 26 14:44:07 crc kubenswrapper[5200]: I1126 14:44:07.670230 5200 generic.go:358] "Generic (PLEG): container finished" podID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerID="9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e" exitCode=0 Nov 26 14:44:07 crc kubenswrapper[5200]: I1126 14:44:07.670421 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rktm5" event={"ID":"28486335-5ef5-47e1-98f9-ce7f9666a506","Type":"ContainerDied","Data":"9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e"} Nov 26 14:44:07 crc kubenswrapper[5200]: I1126 14:44:07.677344 5200 generic.go:358] "Generic (PLEG): container finished" podID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerID="25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800" exitCode=0 Nov 26 14:44:07 crc kubenswrapper[5200]: I1126 14:44:07.677716 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlf27" event={"ID":"a898e136-b2a6-4bbe-ab91-7e0070e79e41","Type":"ContainerDied","Data":"25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800"} Nov 26 14:44:07 crc kubenswrapper[5200]: I1126 14:44:07.677832 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlf27" event={"ID":"a898e136-b2a6-4bbe-ab91-7e0070e79e41","Type":"ContainerStarted","Data":"83afe76c5ae6ded8ef7f654467621a58e20721d1998fb8acfa2c4e4749eb90f6"} Nov 26 14:44:08 crc kubenswrapper[5200]: I1126 14:44:08.686720 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rktm5" event={"ID":"28486335-5ef5-47e1-98f9-ce7f9666a506","Type":"ContainerStarted","Data":"fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2"} Nov 26 14:44:08 crc kubenswrapper[5200]: I1126 14:44:08.703355 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rktm5" podStartSLOduration=4.426371367 podStartE2EDuration="5.703330238s" podCreationTimestamp="2025-11-26 14:44:03 +0000 UTC" firstStartedPulling="2025-11-26 14:44:05.652654931 +0000 UTC m=+4414.331856749" lastFinishedPulling="2025-11-26 14:44:06.929613782 +0000 UTC m=+4415.608815620" observedRunningTime="2025-11-26 14:44:08.702314131 +0000 UTC m=+4417.381515959" watchObservedRunningTime="2025-11-26 14:44:08.703330238 +0000 UTC m=+4417.382532066" Nov 26 14:44:10 crc kubenswrapper[5200]: I1126 14:44:10.705697 5200 generic.go:358] "Generic (PLEG): container finished" podID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerID="ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71" exitCode=0 Nov 26 14:44:10 crc kubenswrapper[5200]: I1126 14:44:10.705803 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlf27" event={"ID":"a898e136-b2a6-4bbe-ab91-7e0070e79e41","Type":"ContainerDied","Data":"ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71"} Nov 26 14:44:11 crc kubenswrapper[5200]: 
I1126 14:44:11.715458 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlf27" event={"ID":"a898e136-b2a6-4bbe-ab91-7e0070e79e41","Type":"ContainerStarted","Data":"92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228"} Nov 26 14:44:11 crc kubenswrapper[5200]: I1126 14:44:11.746300 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlf27" podStartSLOduration=3.896908629 podStartE2EDuration="5.74627428s" podCreationTimestamp="2025-11-26 14:44:06 +0000 UTC" firstStartedPulling="2025-11-26 14:44:07.678850935 +0000 UTC m=+4416.358052763" lastFinishedPulling="2025-11-26 14:44:09.528216596 +0000 UTC m=+4418.207418414" observedRunningTime="2025-11-26 14:44:11.732957587 +0000 UTC m=+4420.412159465" watchObservedRunningTime="2025-11-26 14:44:11.74627428 +0000 UTC m=+4420.425476138" Nov 26 14:44:12 crc kubenswrapper[5200]: I1126 14:44:12.975779 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:44:12 crc kubenswrapper[5200]: E1126 14:44:12.976492 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:44:14 crc kubenswrapper[5200]: I1126 14:44:14.406838 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:14 crc kubenswrapper[5200]: I1126 14:44:14.407227 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:14 crc kubenswrapper[5200]: I1126 14:44:14.473973 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:14 crc kubenswrapper[5200]: I1126 14:44:14.815421 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:15 crc kubenswrapper[5200]: I1126 14:44:15.592753 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rktm5"] Nov 26 14:44:16 crc kubenswrapper[5200]: I1126 14:44:16.775988 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rktm5" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="registry-server" containerID="cri-o://fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2" gracePeriod=2 Nov 26 14:44:16 crc kubenswrapper[5200]: I1126 14:44:16.787457 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:16 crc kubenswrapper[5200]: I1126 14:44:16.787538 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:16 crc kubenswrapper[5200]: I1126 14:44:16.853708 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.219609 5200 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.332120 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-catalog-content\") pod \"28486335-5ef5-47e1-98f9-ce7f9666a506\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.332216 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tgmt\" (UniqueName: \"kubernetes.io/projected/28486335-5ef5-47e1-98f9-ce7f9666a506-kube-api-access-2tgmt\") pod \"28486335-5ef5-47e1-98f9-ce7f9666a506\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.333551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-utilities\") pod \"28486335-5ef5-47e1-98f9-ce7f9666a506\" (UID: \"28486335-5ef5-47e1-98f9-ce7f9666a506\") " Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.334632 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-utilities" (OuterVolumeSpecName: "utilities") pod "28486335-5ef5-47e1-98f9-ce7f9666a506" (UID: "28486335-5ef5-47e1-98f9-ce7f9666a506"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.339812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28486335-5ef5-47e1-98f9-ce7f9666a506-kube-api-access-2tgmt" (OuterVolumeSpecName: "kube-api-access-2tgmt") pod "28486335-5ef5-47e1-98f9-ce7f9666a506" (UID: "28486335-5ef5-47e1-98f9-ce7f9666a506"). InnerVolumeSpecName "kube-api-access-2tgmt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.344970 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28486335-5ef5-47e1-98f9-ce7f9666a506" (UID: "28486335-5ef5-47e1-98f9-ce7f9666a506"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.435763 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.435794 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28486335-5ef5-47e1-98f9-ce7f9666a506-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.435805 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tgmt\" (UniqueName: \"kubernetes.io/projected/28486335-5ef5-47e1-98f9-ce7f9666a506-kube-api-access-2tgmt\") on node \"crc\" DevicePath \"\"" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.790963 5200 generic.go:358] "Generic (PLEG): container finished" podID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerID="fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2" exitCode=0 Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.791043 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rktm5" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.791053 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rktm5" event={"ID":"28486335-5ef5-47e1-98f9-ce7f9666a506","Type":"ContainerDied","Data":"fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2"} Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.791097 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rktm5" event={"ID":"28486335-5ef5-47e1-98f9-ce7f9666a506","Type":"ContainerDied","Data":"916f8e5def77a9eb7d763b62ae1d44ced64a8d876b5783dfa78fb718d9f0542c"} Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.791121 5200 scope.go:117] "RemoveContainer" containerID="fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.840030 5200 scope.go:117] "RemoveContainer" containerID="9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.851464 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.854149 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rktm5"] Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.862496 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rktm5"] Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.865057 5200 scope.go:117] "RemoveContainer" containerID="023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.903267 5200 scope.go:117] "RemoveContainer" containerID="fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2" Nov 26 14:44:17 crc kubenswrapper[5200]: E1126 14:44:17.904029 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2\": container with ID starting with 
fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2 not found: ID does not exist" containerID="fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.904087 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2"} err="failed to get container status \"fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2\": rpc error: code = NotFound desc = could not find container \"fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2\": container with ID starting with fdf10edb69436596c00862b97fefe2872815e6eae22e7878f0fd204bc5337cc2 not found: ID does not exist" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.904124 5200 scope.go:117] "RemoveContainer" containerID="9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e" Nov 26 14:44:17 crc kubenswrapper[5200]: E1126 14:44:17.904486 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e\": container with ID starting with 9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e not found: ID does not exist" containerID="9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.904522 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e"} err="failed to get container status \"9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e\": rpc error: code = NotFound desc = could not find container \"9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e\": container with ID starting with 9d35042cdb11658b7c9e8ff9f9f2b9ddfdd0179cd5548e74533d532f5731370e not found: ID does not exist" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.904540 5200 scope.go:117] "RemoveContainer" containerID="023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d" Nov 26 14:44:17 crc kubenswrapper[5200]: E1126 14:44:17.904818 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d\": container with ID starting with 023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d not found: ID does not exist" containerID="023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d" Nov 26 14:44:17 crc kubenswrapper[5200]: I1126 14:44:17.904842 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d"} err="failed to get container status \"023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d\": rpc error: code = NotFound desc = could not find container \"023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d\": container with ID starting with 023b8f9c90916ce8de668e48881ae5ba18ffb8f472c01be72d7f40f845f51b9d not found: ID does not exist" Nov 26 14:44:18 crc kubenswrapper[5200]: I1126 14:44:18.987596 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" path="/var/lib/kubelet/pods/28486335-5ef5-47e1-98f9-ce7f9666a506/volumes" Nov 26 14:44:19 crc kubenswrapper[5200]: I1126 14:44:19.188283 
5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlf27"] Nov 26 14:44:19 crc kubenswrapper[5200]: I1126 14:44:19.818011 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlf27" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="registry-server" containerID="cri-o://92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228" gracePeriod=2 Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.307539 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.386181 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-utilities\") pod \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.386302 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-catalog-content\") pod \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.386584 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7vd\" (UniqueName: \"kubernetes.io/projected/a898e136-b2a6-4bbe-ab91-7e0070e79e41-kube-api-access-bd7vd\") pod \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\" (UID: \"a898e136-b2a6-4bbe-ab91-7e0070e79e41\") " Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.388030 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-utilities" (OuterVolumeSpecName: "utilities") pod "a898e136-b2a6-4bbe-ab91-7e0070e79e41" (UID: "a898e136-b2a6-4bbe-ab91-7e0070e79e41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.394444 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a898e136-b2a6-4bbe-ab91-7e0070e79e41-kube-api-access-bd7vd" (OuterVolumeSpecName: "kube-api-access-bd7vd") pod "a898e136-b2a6-4bbe-ab91-7e0070e79e41" (UID: "a898e136-b2a6-4bbe-ab91-7e0070e79e41"). InnerVolumeSpecName "kube-api-access-bd7vd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.422828 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a898e136-b2a6-4bbe-ab91-7e0070e79e41" (UID: "a898e136-b2a6-4bbe-ab91-7e0070e79e41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.489317 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.489715 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bd7vd\" (UniqueName: \"kubernetes.io/projected/a898e136-b2a6-4bbe-ab91-7e0070e79e41-kube-api-access-bd7vd\") on node \"crc\" DevicePath \"\"" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.489864 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a898e136-b2a6-4bbe-ab91-7e0070e79e41-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.835367 5200 generic.go:358] "Generic (PLEG): container finished" podID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerID="92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228" exitCode=0 Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.835519 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlf27" event={"ID":"a898e136-b2a6-4bbe-ab91-7e0070e79e41","Type":"ContainerDied","Data":"92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228"} Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.835567 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlf27" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.835614 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlf27" event={"ID":"a898e136-b2a6-4bbe-ab91-7e0070e79e41","Type":"ContainerDied","Data":"83afe76c5ae6ded8ef7f654467621a58e20721d1998fb8acfa2c4e4749eb90f6"} Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.835648 5200 scope.go:117] "RemoveContainer" containerID="92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.874708 5200 scope.go:117] "RemoveContainer" containerID="ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.917627 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlf27"] Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.926688 5200 scope.go:117] "RemoveContainer" containerID="25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.931566 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlf27"] Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.949754 5200 scope.go:117] "RemoveContainer" containerID="92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228" Nov 26 14:44:20 crc kubenswrapper[5200]: E1126 14:44:20.950382 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228\": container with ID starting with 92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228 not found: ID does not exist" containerID="92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.950455 
5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228"} err="failed to get container status \"92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228\": rpc error: code = NotFound desc = could not find container \"92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228\": container with ID starting with 92e7d13f15710df1d0fddf45cedb238a142e8134cc1588a3b71cf13ad59b2228 not found: ID does not exist" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.950501 5200 scope.go:117] "RemoveContainer" containerID="ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71" Nov 26 14:44:20 crc kubenswrapper[5200]: E1126 14:44:20.951114 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71\": container with ID starting with ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71 not found: ID does not exist" containerID="ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.951218 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71"} err="failed to get container status \"ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71\": rpc error: code = NotFound desc = could not find container \"ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71\": container with ID starting with ddc575875cc199b4ff21901e5a227b5ddc04971193284e645173725c1fe6ab71 not found: ID does not exist" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.951276 5200 scope.go:117] "RemoveContainer" containerID="25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800" Nov 26 14:44:20 crc kubenswrapper[5200]: E1126 14:44:20.951710 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800\": container with ID starting with 25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800 not found: ID does not exist" containerID="25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.951765 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800"} err="failed to get container status \"25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800\": rpc error: code = NotFound desc = could not find container \"25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800\": container with ID starting with 25eafc2ffd19e2e3e2bc386921362b9d66bd0a99706ce475af83cf646d31a800 not found: ID does not exist" Nov 26 14:44:20 crc kubenswrapper[5200]: I1126 14:44:20.984138 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" path="/var/lib/kubelet/pods/a898e136-b2a6-4bbe-ab91-7e0070e79e41/volumes" Nov 26 14:44:27 crc kubenswrapper[5200]: I1126 14:44:27.969852 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:44:27 crc kubenswrapper[5200]: E1126 14:44:27.971332 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:44:41 crc kubenswrapper[5200]: E1126 14:44:41.916477 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167885 actualBytes=10240 Nov 26 14:44:41 crc kubenswrapper[5200]: I1126 14:44:41.969761 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:44:41 crc kubenswrapper[5200]: E1126 14:44:41.969933 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:44:54 crc kubenswrapper[5200]: I1126 14:44:54.969839 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:44:54 crc kubenswrapper[5200]: E1126 14:44:54.971171 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.171367 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp"] Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.172904 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="registry-server" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.172929 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="registry-server" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.172951 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="extract-utilities" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.172960 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="extract-utilities" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.172992 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="extract-content" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173000 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="extract-content" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173018 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" 
containerName="extract-utilities" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173025 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="extract-utilities" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173052 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="extract-content" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173060 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="extract-content" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173076 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="registry-server" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173083 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="registry-server" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173249 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a898e136-b2a6-4bbe-ab91-7e0070e79e41" containerName="registry-server" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.173271 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="28486335-5ef5-47e1-98f9-ce7f9666a506" containerName="registry-server" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.181762 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp"] Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.181889 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.184002 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.185624 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.314080 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9783529c-b45f-464e-b451-ef77c2d0f41f-secret-volume\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.314143 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298nr\" (UniqueName: \"kubernetes.io/projected/9783529c-b45f-464e-b451-ef77c2d0f41f-kube-api-access-298nr\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.314285 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9783529c-b45f-464e-b451-ef77c2d0f41f-config-volume\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.416022 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9783529c-b45f-464e-b451-ef77c2d0f41f-config-volume\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.416159 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9783529c-b45f-464e-b451-ef77c2d0f41f-secret-volume\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.416241 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-298nr\" (UniqueName: \"kubernetes.io/projected/9783529c-b45f-464e-b451-ef77c2d0f41f-kube-api-access-298nr\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.417296 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9783529c-b45f-464e-b451-ef77c2d0f41f-config-volume\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.428486 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9783529c-b45f-464e-b451-ef77c2d0f41f-secret-volume\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.437093 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-298nr\" (UniqueName: \"kubernetes.io/projected/9783529c-b45f-464e-b451-ef77c2d0f41f-kube-api-access-298nr\") pod \"collect-profiles-29402805-7hsqp\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.505481 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:00 crc kubenswrapper[5200]: I1126 14:45:00.807440 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp"] Nov 26 14:45:01 crc kubenswrapper[5200]: I1126 14:45:01.182070 5200 generic.go:358] "Generic (PLEG): container finished" podID="9783529c-b45f-464e-b451-ef77c2d0f41f" containerID="0dd57eb0ded6b4a153a2992cee09497d1e91a43a64584d54fda46ffdcb753f1c" exitCode=0 Nov 26 14:45:01 crc kubenswrapper[5200]: I1126 14:45:01.182145 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" event={"ID":"9783529c-b45f-464e-b451-ef77c2d0f41f","Type":"ContainerDied","Data":"0dd57eb0ded6b4a153a2992cee09497d1e91a43a64584d54fda46ffdcb753f1c"} Nov 26 14:45:01 crc kubenswrapper[5200]: I1126 14:45:01.183647 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" event={"ID":"9783529c-b45f-464e-b451-ef77c2d0f41f","Type":"ContainerStarted","Data":"13d0c41933e6b2e9c9b34d989a3b040621fae880fad872bd2f1613a1337d4d34"} Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.469437 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.645846 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9783529c-b45f-464e-b451-ef77c2d0f41f-config-volume\") pod \"9783529c-b45f-464e-b451-ef77c2d0f41f\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.645989 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-298nr\" (UniqueName: \"kubernetes.io/projected/9783529c-b45f-464e-b451-ef77c2d0f41f-kube-api-access-298nr\") pod \"9783529c-b45f-464e-b451-ef77c2d0f41f\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.646074 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9783529c-b45f-464e-b451-ef77c2d0f41f-secret-volume\") pod \"9783529c-b45f-464e-b451-ef77c2d0f41f\" (UID: \"9783529c-b45f-464e-b451-ef77c2d0f41f\") " Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.647163 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9783529c-b45f-464e-b451-ef77c2d0f41f-config-volume" (OuterVolumeSpecName: "config-volume") pod "9783529c-b45f-464e-b451-ef77c2d0f41f" (UID: "9783529c-b45f-464e-b451-ef77c2d0f41f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.651392 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9783529c-b45f-464e-b451-ef77c2d0f41f-kube-api-access-298nr" (OuterVolumeSpecName: "kube-api-access-298nr") pod "9783529c-b45f-464e-b451-ef77c2d0f41f" (UID: "9783529c-b45f-464e-b451-ef77c2d0f41f"). InnerVolumeSpecName "kube-api-access-298nr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.653153 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9783529c-b45f-464e-b451-ef77c2d0f41f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9783529c-b45f-464e-b451-ef77c2d0f41f" (UID: "9783529c-b45f-464e-b451-ef77c2d0f41f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.747883 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-298nr\" (UniqueName: \"kubernetes.io/projected/9783529c-b45f-464e-b451-ef77c2d0f41f-kube-api-access-298nr\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.747911 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9783529c-b45f-464e-b451-ef77c2d0f41f-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:02 crc kubenswrapper[5200]: I1126 14:45:02.747920 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9783529c-b45f-464e-b451-ef77c2d0f41f-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:03 crc kubenswrapper[5200]: I1126 14:45:03.202719 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" event={"ID":"9783529c-b45f-464e-b451-ef77c2d0f41f","Type":"ContainerDied","Data":"13d0c41933e6b2e9c9b34d989a3b040621fae880fad872bd2f1613a1337d4d34"} Nov 26 14:45:03 crc kubenswrapper[5200]: I1126 14:45:03.202752 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp" Nov 26 14:45:03 crc kubenswrapper[5200]: I1126 14:45:03.202772 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d0c41933e6b2e9c9b34d989a3b040621fae880fad872bd2f1613a1337d4d34" Nov 26 14:45:03 crc kubenswrapper[5200]: I1126 14:45:03.539958 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d"] Nov 26 14:45:03 crc kubenswrapper[5200]: I1126 14:45:03.545238 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402760-rxf2d"] Nov 26 14:45:04 crc kubenswrapper[5200]: I1126 14:45:04.978562 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236b4a36-aa29-49ef-a0f2-eb61772cb764" path="/var/lib/kubelet/pods/236b4a36-aa29-49ef-a0f2-eb61772cb764/volumes" Nov 26 14:45:07 crc kubenswrapper[5200]: I1126 14:45:07.969533 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:45:07 crc kubenswrapper[5200]: E1126 14:45:07.970887 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:45:20 crc kubenswrapper[5200]: I1126 14:45:20.970211 5200 scope.go:117] "RemoveContainer" 
containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:45:20 crc kubenswrapper[5200]: E1126 14:45:20.971408 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.073842 5200 scope.go:117] "RemoveContainer" containerID="c38c3385081ffc236fe3f1b574995afe5179a1ed2a52dff40b8e59e67daeae26" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.516748 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ppxwn"] Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.522988 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ppxwn"] Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.665380 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ll2vq"] Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.666658 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9783529c-b45f-464e-b451-ef77c2d0f41f" containerName="collect-profiles" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.666705 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9783529c-b45f-464e-b451-ef77c2d0f41f" containerName="collect-profiles" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.666971 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9783529c-b45f-464e-b451-ef77c2d0f41f" containerName="collect-profiles" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.795235 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ll2vq"] Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.795410 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.798041 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"crc-storage\"/\"crc-storage-dockercfg-c4cvj\"" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.798082 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"openshift-service-ca.crt\"" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.798214 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"kube-root-ca.crt\"" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.798361 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"crc-storage\"" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.877093 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bb5b\" (UniqueName: \"kubernetes.io/projected/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-kube-api-access-7bb5b\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.877206 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-node-mnt\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.877261 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-crc-storage\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.978903 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bb5b\" (UniqueName: \"kubernetes.io/projected/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-kube-api-access-7bb5b\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.978975 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-node-mnt\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.979029 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-crc-storage\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.979411 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-node-mnt\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:27 crc kubenswrapper[5200]: I1126 14:45:27.980771 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-crc-storage\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:28 crc kubenswrapper[5200]: I1126 14:45:28.011796 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bb5b\" (UniqueName: \"kubernetes.io/projected/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-kube-api-access-7bb5b\") pod \"crc-storage-crc-ll2vq\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:28 crc kubenswrapper[5200]: I1126 14:45:28.117512 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:28 crc kubenswrapper[5200]: I1126 14:45:28.578676 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ll2vq"] Nov 26 14:45:28 crc kubenswrapper[5200]: I1126 14:45:28.981789 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17fa9cf-7538-4315-a9c8-5542da7f1eaf" path="/var/lib/kubelet/pods/f17fa9cf-7538-4315-a9c8-5542da7f1eaf/volumes" Nov 26 14:45:29 crc kubenswrapper[5200]: I1126 14:45:29.402856 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ll2vq" event={"ID":"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82","Type":"ContainerStarted","Data":"1ef4acf8726e29306ed642b5bc0fcabe90c11b8983b10dc2d67406393ff28fe2"} Nov 26 14:45:31 crc kubenswrapper[5200]: I1126 14:45:31.432569 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ll2vq" event={"ID":"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82","Type":"ContainerStarted","Data":"f650c3431ad11a5b3b370cade64429dabda9a6022b34217a543ae6d6f2bfc026"} Nov 26 14:45:31 crc kubenswrapper[5200]: I1126 14:45:31.455756 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-ll2vq" podStartSLOduration=2.215074262 podStartE2EDuration="4.455670994s" podCreationTimestamp="2025-11-26 14:45:27 +0000 UTC" firstStartedPulling="2025-11-26 14:45:28.589634827 +0000 UTC m=+4497.268836675" lastFinishedPulling="2025-11-26 14:45:30.830231589 +0000 UTC m=+4499.509433407" observedRunningTime="2025-11-26 14:45:31.448379891 +0000 UTC m=+4500.127581729" watchObservedRunningTime="2025-11-26 14:45:31.455670994 +0000 UTC m=+4500.134872822" Nov 26 14:45:32 crc kubenswrapper[5200]: I1126 14:45:32.439906 5200 generic.go:358] "Generic (PLEG): container finished" podID="2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" containerID="f650c3431ad11a5b3b370cade64429dabda9a6022b34217a543ae6d6f2bfc026" exitCode=0 Nov 26 14:45:32 crc kubenswrapper[5200]: I1126 14:45:32.439951 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ll2vq" event={"ID":"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82","Type":"ContainerDied","Data":"f650c3431ad11a5b3b370cade64429dabda9a6022b34217a543ae6d6f2bfc026"} Nov 26 14:45:33 crc kubenswrapper[5200]: I1126 14:45:33.872274 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:33 crc kubenswrapper[5200]: I1126 14:45:33.982231 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bb5b\" (UniqueName: \"kubernetes.io/projected/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-kube-api-access-7bb5b\") pod \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " Nov 26 14:45:33 crc kubenswrapper[5200]: I1126 14:45:33.982309 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-node-mnt\") pod \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " Nov 26 14:45:33 crc kubenswrapper[5200]: I1126 14:45:33.982353 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-crc-storage\") pod \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\" (UID: \"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82\") " Nov 26 14:45:33 crc kubenswrapper[5200]: I1126 14:45:33.982763 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" (UID: "2fba0d6f-c7a8-4fdf-bd80-420f6b648d82"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 14:45:33 crc kubenswrapper[5200]: I1126 14:45:33.987897 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-kube-api-access-7bb5b" (OuterVolumeSpecName: "kube-api-access-7bb5b") pod "2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" (UID: "2fba0d6f-c7a8-4fdf-bd80-420f6b648d82"). InnerVolumeSpecName "kube-api-access-7bb5b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.000553 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" (UID: "2fba0d6f-c7a8-4fdf-bd80-420f6b648d82"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.083827 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7bb5b\" (UniqueName: \"kubernetes.io/projected/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-kube-api-access-7bb5b\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.083866 5200 reconciler_common.go:299] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.083885 5200 reconciler_common.go:299] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.459427 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ll2vq" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.459450 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ll2vq" event={"ID":"2fba0d6f-c7a8-4fdf-bd80-420f6b648d82","Type":"ContainerDied","Data":"1ef4acf8726e29306ed642b5bc0fcabe90c11b8983b10dc2d67406393ff28fe2"} Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.459979 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ef4acf8726e29306ed642b5bc0fcabe90c11b8983b10dc2d67406393ff28fe2" Nov 26 14:45:34 crc kubenswrapper[5200]: E1126 14:45:34.620955 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fba0d6f_c7a8_4fdf_bd80_420f6b648d82.slice/crio-1ef4acf8726e29306ed642b5bc0fcabe90c11b8983b10dc2d67406393ff28fe2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fba0d6f_c7a8_4fdf_bd80_420f6b648d82.slice\": RecentStats: unable to find data in memory cache]" Nov 26 14:45:34 crc kubenswrapper[5200]: I1126 14:45:34.969369 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:45:34 crc kubenswrapper[5200]: E1126 14:45:34.970017 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.582356 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ll2vq"] Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.588864 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ll2vq"] Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.702932 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-4f8lp"] Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.704113 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" containerName="storage" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.704139 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" containerName="storage" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.704313 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" containerName="storage" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.811825 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4f8lp"] Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.811969 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.818352 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"crc-storage\"" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.818347 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"openshift-service-ca.crt\"" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.818413 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"crc-storage\"/\"crc-storage-dockercfg-c4cvj\"" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.818946 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"crc-storage\"/\"kube-root-ca.crt\"" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.916621 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5fw\" (UniqueName: \"kubernetes.io/projected/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-kube-api-access-xl5fw\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.916724 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-node-mnt\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:35 crc kubenswrapper[5200]: I1126 14:45:35.916855 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-crc-storage\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.019173 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-crc-storage\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.019526 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5fw\" (UniqueName: \"kubernetes.io/projected/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-kube-api-access-xl5fw\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.019642 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-node-mnt\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.020229 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-node-mnt\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.021043 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-crc-storage\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.058468 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5fw\" (UniqueName: \"kubernetes.io/projected/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-kube-api-access-xl5fw\") pod \"crc-storage-crc-4f8lp\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.130083 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.563795 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4f8lp"] Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.655382 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.661581 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.674618 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:45:36 crc kubenswrapper[5200]: I1126 14:45:36.678876 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:45:37 crc kubenswrapper[5200]: I1126 14:45:37.008524 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fba0d6f-c7a8-4fdf-bd80-420f6b648d82" path="/var/lib/kubelet/pods/2fba0d6f-c7a8-4fdf-bd80-420f6b648d82/volumes" Nov 26 14:45:37 crc kubenswrapper[5200]: I1126 14:45:37.483710 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4f8lp" event={"ID":"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e","Type":"ContainerStarted","Data":"02b448346d5ed86ec211871f7b43f90dec6b51b467d3b55bd909c8c32442ee18"} Nov 26 14:45:39 crc kubenswrapper[5200]: I1126 14:45:39.502601 5200 generic.go:358] "Generic (PLEG): container finished" podID="1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" containerID="20b45ed8b7eb52fb117a648f4da78248e35ce809e4f4978347f7b09fc9cf042f" exitCode=0 Nov 26 14:45:39 crc kubenswrapper[5200]: I1126 14:45:39.502711 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4f8lp" event={"ID":"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e","Type":"ContainerDied","Data":"20b45ed8b7eb52fb117a648f4da78248e35ce809e4f4978347f7b09fc9cf042f"} Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.806641 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.903131 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl5fw\" (UniqueName: \"kubernetes.io/projected/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-kube-api-access-xl5fw\") pod \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.903272 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-crc-storage\") pod \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.903388 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-node-mnt\") pod \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\" (UID: \"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e\") " Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.903524 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" (UID: "1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.903850 5200 reconciler_common.go:299] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.909421 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-kube-api-access-xl5fw" (OuterVolumeSpecName: "kube-api-access-xl5fw") pod "1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" (UID: "1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e"). InnerVolumeSpecName "kube-api-access-xl5fw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:45:40 crc kubenswrapper[5200]: I1126 14:45:40.936192 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" (UID: "1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:45:41 crc kubenswrapper[5200]: I1126 14:45:41.005029 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl5fw\" (UniqueName: \"kubernetes.io/projected/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-kube-api-access-xl5fw\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:41 crc kubenswrapper[5200]: I1126 14:45:41.005076 5200 reconciler_common.go:299] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 26 14:45:41 crc kubenswrapper[5200]: I1126 14:45:41.516949 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4f8lp" Nov 26 14:45:41 crc kubenswrapper[5200]: I1126 14:45:41.516967 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4f8lp" event={"ID":"1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e","Type":"ContainerDied","Data":"02b448346d5ed86ec211871f7b43f90dec6b51b467d3b55bd909c8c32442ee18"} Nov 26 14:45:41 crc kubenswrapper[5200]: I1126 14:45:41.517028 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02b448346d5ed86ec211871f7b43f90dec6b51b467d3b55bd909c8c32442ee18" Nov 26 14:45:41 crc kubenswrapper[5200]: E1126 14:45:41.967942 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167884 actualBytes=10240 Nov 26 14:45:48 crc kubenswrapper[5200]: I1126 14:45:48.980490 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:45:48 crc kubenswrapper[5200]: E1126 14:45:48.981194 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:45:59 crc kubenswrapper[5200]: I1126 14:45:59.969272 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:45:59 crc kubenswrapper[5200]: E1126 14:45:59.970167 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:46:14 crc kubenswrapper[5200]: I1126 14:46:14.971753 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:46:14 crc kubenswrapper[5200]: E1126 14:46:14.972827 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:46:26 crc kubenswrapper[5200]: I1126 14:46:26.970632 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:46:26 crc kubenswrapper[5200]: E1126 14:46:26.971587 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:46:27 crc kubenswrapper[5200]: I1126 14:46:27.133794 5200 scope.go:117] 
"RemoveContainer" containerID="a7b87f4f526bf4eed31c217753c0fbdb7761d4e01bcfff502767efb319ac15ff" Nov 26 14:46:38 crc kubenswrapper[5200]: I1126 14:46:38.969557 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:46:38 crc kubenswrapper[5200]: E1126 14:46:38.972228 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:46:42 crc kubenswrapper[5200]: E1126 14:46:42.248223 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167888 actualBytes=10240 Nov 26 14:46:52 crc kubenswrapper[5200]: I1126 14:46:52.983559 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:46:52 crc kubenswrapper[5200]: E1126 14:46:52.984519 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:47:03 crc kubenswrapper[5200]: I1126 14:47:03.969933 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:47:03 crc kubenswrapper[5200]: E1126 14:47:03.972064 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:47:14 crc kubenswrapper[5200]: I1126 14:47:14.983090 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:47:14 crc kubenswrapper[5200]: E1126 14:47:14.989516 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:47:28 crc kubenswrapper[5200]: I1126 14:47:28.984211 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:47:28 crc kubenswrapper[5200]: E1126 14:47:28.985307 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:47:42 crc kubenswrapper[5200]: I1126 14:47:42.978218 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:47:42 crc kubenswrapper[5200]: E1126 14:47:42.979180 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:47:43 crc kubenswrapper[5200]: E1126 14:47:43.101196 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1167888 actualBytes=10240 Nov 26 14:47:56 crc kubenswrapper[5200]: I1126 14:47:56.972063 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:47:56 crc kubenswrapper[5200]: E1126 14:47:56.973441 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:48:10 crc kubenswrapper[5200]: I1126 14:48:10.969678 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:48:10 crc kubenswrapper[5200]: E1126 14:48:10.971241 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:48:22 crc kubenswrapper[5200]: I1126 14:48:22.983129 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:48:22 crc kubenswrapper[5200]: E1126 14:48:22.983907 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.712360 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qbsqz"] Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.715978 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" containerName="storage" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.716005 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" 
containerName="storage" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.716962 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fbde6c6-c3f8-46b7-89e1-e9c0bd960e0e" containerName="storage" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.744087 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbsqz"] Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.744284 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.818352 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twccz\" (UniqueName: \"kubernetes.io/projected/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-kube-api-access-twccz\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.818498 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-catalog-content\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.818569 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-utilities\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.919780 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twccz\" (UniqueName: \"kubernetes.io/projected/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-kube-api-access-twccz\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.919880 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-catalog-content\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.919916 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-utilities\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.920659 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-utilities\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.921624 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-catalog-content\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.942102 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twccz\" (UniqueName: \"kubernetes.io/projected/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-kube-api-access-twccz\") pod \"community-operators-qbsqz\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.969466 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:48:37 crc kubenswrapper[5200]: I1126 14:48:37.971890 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:48:38 crc kubenswrapper[5200]: I1126 14:48:38.077242 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:38 crc kubenswrapper[5200]: I1126 14:48:38.189920 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"ea3a2115a282f6ba849c903139ac342ea4d280f4ca9dbebec0216cb6655bfa03"} Nov 26 14:48:38 crc kubenswrapper[5200]: I1126 14:48:38.591764 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbsqz"] Nov 26 14:48:38 crc kubenswrapper[5200]: W1126 14:48:38.600812 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f9a73c_f85c_4bde_9229_ed9e7e5bcbfa.slice/crio-c081f0a5a2d9e35719267da8214f63e361030df4fddf40e59ccd7d58fafe47c3 WatchSource:0}: Error finding container c081f0a5a2d9e35719267da8214f63e361030df4fddf40e59ccd7d58fafe47c3: Status 404 returned error can't find the container with id c081f0a5a2d9e35719267da8214f63e361030df4fddf40e59ccd7d58fafe47c3 Nov 26 14:48:39 crc kubenswrapper[5200]: I1126 14:48:39.210026 5200 generic.go:358] "Generic (PLEG): container finished" podID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerID="4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2" exitCode=0 Nov 26 14:48:39 crc kubenswrapper[5200]: I1126 14:48:39.210119 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbsqz" event={"ID":"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa","Type":"ContainerDied","Data":"4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2"} Nov 26 14:48:39 crc kubenswrapper[5200]: I1126 14:48:39.210447 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbsqz" event={"ID":"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa","Type":"ContainerStarted","Data":"c081f0a5a2d9e35719267da8214f63e361030df4fddf40e59ccd7d58fafe47c3"} Nov 26 14:48:41 crc kubenswrapper[5200]: I1126 14:48:41.233128 5200 generic.go:358] "Generic (PLEG): container finished" podID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerID="f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4" exitCode=0 Nov 26 14:48:41 crc kubenswrapper[5200]: I1126 14:48:41.233195 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-qbsqz" event={"ID":"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa","Type":"ContainerDied","Data":"f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4"} Nov 26 14:48:41 crc kubenswrapper[5200]: E1126 14:48:41.915067 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1175953 actualBytes=10240 Nov 26 14:48:42 crc kubenswrapper[5200]: I1126 14:48:42.245954 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbsqz" event={"ID":"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa","Type":"ContainerStarted","Data":"ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772"} Nov 26 14:48:42 crc kubenswrapper[5200]: I1126 14:48:42.272020 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qbsqz" podStartSLOduration=4.263514873 podStartE2EDuration="5.271999922s" podCreationTimestamp="2025-11-26 14:48:37 +0000 UTC" firstStartedPulling="2025-11-26 14:48:39.212278675 +0000 UTC m=+4687.891480533" lastFinishedPulling="2025-11-26 14:48:40.220763734 +0000 UTC m=+4688.899965582" observedRunningTime="2025-11-26 14:48:42.264499803 +0000 UTC m=+4690.943701651" watchObservedRunningTime="2025-11-26 14:48:42.271999922 +0000 UTC m=+4690.951201740" Nov 26 14:48:48 crc kubenswrapper[5200]: I1126 14:48:48.078757 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:48 crc kubenswrapper[5200]: I1126 14:48:48.079976 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:48 crc kubenswrapper[5200]: I1126 14:48:48.124319 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:48 crc kubenswrapper[5200]: I1126 14:48:48.329088 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:48 crc kubenswrapper[5200]: I1126 14:48:48.371038 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbsqz"] Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.709984 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8598cb5bcf-ss6v7"] Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.723376 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.726568 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openshift-service-ca.crt\"" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.726727 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns\"" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.726835 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns-svc\"" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.726978 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dnsmasq-dns-dockercfg-2q2gm\"" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.729027 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"kube-root-ca.crt\"" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.730876 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8598cb5bcf-ss6v7"] Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.797834 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-config\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.797893 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-dns-svc\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.798151 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpk4j\" (UniqueName: \"kubernetes.io/projected/f14e5fe6-d9d8-4111-a6fb-db042238318a-kube-api-access-cpk4j\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.899515 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpk4j\" (UniqueName: \"kubernetes.io/projected/f14e5fe6-d9d8-4111-a6fb-db042238318a-kube-api-access-cpk4j\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.899621 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-config\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.899649 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-dns-svc\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.900934 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-config\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.900954 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-dns-svc\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.925827 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpk4j\" (UniqueName: \"kubernetes.io/projected/f14e5fe6-d9d8-4111-a6fb-db042238318a-kube-api-access-cpk4j\") pod \"dnsmasq-dns-8598cb5bcf-ss6v7\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.965466 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5798f4c845-cz7sk"] Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.976999 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:49 crc kubenswrapper[5200]: I1126 14:48:49.979331 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5798f4c845-cz7sk"] Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.046256 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.124709 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-dns-svc\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.125203 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-config\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.125228 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6fc\" (UniqueName: \"kubernetes.io/projected/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-kube-api-access-bh6fc\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.226965 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-dns-svc\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.227084 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-config\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.227108 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6fc\" (UniqueName: \"kubernetes.io/projected/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-kube-api-access-bh6fc\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.228204 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-dns-svc\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.228739 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-config\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.255578 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6fc\" (UniqueName: \"kubernetes.io/projected/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-kube-api-access-bh6fc\") pod \"dnsmasq-dns-5798f4c845-cz7sk\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.301527 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qbsqz" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="registry-server" containerID="cri-o://ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772" gracePeriod=2 Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.339237 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.518393 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8598cb5bcf-ss6v7"] Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.644406 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.733537 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5798f4c845-cz7sk"] Nov 26 14:48:50 crc kubenswrapper[5200]: W1126 14:48:50.734857 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1aef2e_6ca3_4516_aeee_fe6cda2c6646.slice/crio-889d76ab7b46906fc4f692a646f8c849ae1815f2d4fb5941195a49ece4306a06 WatchSource:0}: Error finding container 889d76ab7b46906fc4f692a646f8c849ae1815f2d4fb5941195a49ece4306a06: Status 404 returned error can't find the container with id 889d76ab7b46906fc4f692a646f8c849ae1815f2d4fb5941195a49ece4306a06 Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.734878 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-catalog-content\") pod \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.734981 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-utilities\") pod \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.735073 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twccz\" (UniqueName: \"kubernetes.io/projected/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-kube-api-access-twccz\") pod \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\" (UID: \"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa\") " Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.735986 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-utilities" (OuterVolumeSpecName: "utilities") pod "88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" (UID: "88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.741755 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-kube-api-access-twccz" (OuterVolumeSpecName: "kube-api-access-twccz") pod "88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" (UID: "88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa"). InnerVolumeSpecName "kube-api-access-twccz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.836508 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twccz\" (UniqueName: \"kubernetes.io/projected/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-kube-api-access-twccz\") on node \"crc\" DevicePath \"\"" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.836808 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.860784 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862278 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="extract-utilities" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862313 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="extract-utilities" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862324 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="registry-server" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862332 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="registry-server" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862352 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="extract-content" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862357 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="extract-content" Nov 26 14:48:50 crc kubenswrapper[5200]: I1126 14:48:50.862515 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerName="registry-server" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.072255 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.072528 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.076910 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-server-conf\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.077147 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-server-dockercfg-68vn9\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.077212 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-plugins-conf\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.077286 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-erlang-cookie\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.081192 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-default-user\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.095100 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" (UID: "88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.141548 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.150083 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.161050 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.162984 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-default-user\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.163656 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.164454 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-dockercfg-lqkrx\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.164539 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-conf\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.164763 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-plugins-conf\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.164789 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-erlang-cookie\"" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242351 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242396 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242428 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242452 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242619 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee40693c-087a-49b3-9a84-cbc50c66b21e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242726 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87a0d855-31af-4e13-8461-4afd0f0edaaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " 
pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242759 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242808 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242837 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242883 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242935 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq9l6\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-kube-api-access-wq9l6\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.242989 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee40693c-087a-49b3-9a84-cbc50c66b21e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.243031 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.243098 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87a0d855-31af-4e13-8461-4afd0f0edaaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.243133 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf4sc\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-kube-api-access-lf4sc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.243188 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.243215 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.243279 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.309792 5200 generic.go:358] "Generic (PLEG): container finished" podID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" containerID="ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772" exitCode=0 Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.309824 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbsqz" event={"ID":"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa","Type":"ContainerDied","Data":"ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772"} Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.309892 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbsqz" event={"ID":"88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa","Type":"ContainerDied","Data":"c081f0a5a2d9e35719267da8214f63e361030df4fddf40e59ccd7d58fafe47c3"} Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.309912 5200 scope.go:117] "RemoveContainer" containerID="ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.309912 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qbsqz" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.311982 5200 generic.go:358] "Generic (PLEG): container finished" podID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerID="0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b" exitCode=0 Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.312166 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" event={"ID":"f14e5fe6-d9d8-4111-a6fb-db042238318a","Type":"ContainerDied","Data":"0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b"} Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.312186 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" event={"ID":"f14e5fe6-d9d8-4111-a6fb-db042238318a","Type":"ContainerStarted","Data":"22045fa10aa2375bd13cdf153c677f9c45bb2ffeff7d4d22da43ba9f0ae36396"} Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.320015 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" event={"ID":"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646","Type":"ContainerStarted","Data":"889d76ab7b46906fc4f692a646f8c849ae1815f2d4fb5941195a49ece4306a06"} Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344145 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344384 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wq9l6\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-kube-api-access-wq9l6\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344406 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee40693c-087a-49b3-9a84-cbc50c66b21e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344422 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344451 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87a0d855-31af-4e13-8461-4afd0f0edaaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344474 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf4sc\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-kube-api-access-lf4sc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344499 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344519 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344535 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344556 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344569 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344590 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344600 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344608 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344676 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee40693c-087a-49b3-9a84-cbc50c66b21e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344846 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87a0d855-31af-4e13-8461-4afd0f0edaaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344872 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344911 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.344943 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.345440 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.345656 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.346046 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.347991 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.348263 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.348468 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.349305 5200 scope.go:117] "RemoveContainer" containerID="f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.349389 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.349667 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.353992 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.354036 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2f1017e1977e01fab334d49bcc5d7405af4398edd347a7a80d8f13ccf4900573/globalmount\"" pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.355389 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.355431 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b3e74fd8d458bb9aaf3401e4a41a44475872a1d27ea67bd524df4aa3945308e1/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.357892 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee40693c-087a-49b3-9a84-cbc50c66b21e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.361037 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee40693c-087a-49b3-9a84-cbc50c66b21e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.370222 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87a0d855-31af-4e13-8461-4afd0f0edaaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.370359 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87a0d855-31af-4e13-8461-4afd0f0edaaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.370622 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.375728 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq9l6\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-kube-api-access-wq9l6\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.376627 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf4sc\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-kube-api-access-lf4sc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.411815 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.414313 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " pod="openstack/rabbitmq-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.477958 5200 scope.go:117] "RemoveContainer" containerID="4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.501857 5200 scope.go:117] "RemoveContainer" containerID="ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772" Nov 26 14:48:51 crc kubenswrapper[5200]: E1126 14:48:51.502476 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772\": container with ID starting with ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772 not found: ID does not exist" containerID="ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.502514 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772"} err="failed to get container status \"ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772\": rpc error: code = NotFound desc = could not find container \"ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772\": container with ID starting with ffb4ae0faeb95d145f76d2d7536ebff7457b1beb4ac40ceaf5712fb66f522772 not found: ID does not exist" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.502538 5200 scope.go:117] "RemoveContainer" containerID="f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4" Nov 26 14:48:51 crc kubenswrapper[5200]: E1126 14:48:51.502938 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4\": container with ID starting with f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4 not found: ID does not exist" containerID="f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.502970 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4"} err="failed to get container status \"f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4\": rpc error: code = NotFound desc = could not find container \"f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4\": container with ID starting with f651af9ff18b644fbc05b416a70a73f461ca19034827e2efbfa835eb2ae504d4 not found: ID does not exist" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.502992 5200 scope.go:117] "RemoveContainer" containerID="4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2" Nov 26 14:48:51 crc kubenswrapper[5200]: E1126 14:48:51.503259 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2\": container with ID 
starting with 4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2 not found: ID does not exist" containerID="4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.503286 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2"} err="failed to get container status \"4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2\": rpc error: code = NotFound desc = could not find container \"4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2\": container with ID starting with 4874209b24b85945c22cb2e49dbea8e70c390ee94c5621a71eb042d539de04a2 not found: ID does not exist" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.511927 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbsqz"] Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.517903 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qbsqz"] Nov 26 14:48:51 crc kubenswrapper[5200]: E1126 14:48:51.557442 5200 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Nov 26 14:48:51 crc kubenswrapper[5200]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f14e5fe6-d9d8-4111-a6fb-db042238318a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 26 14:48:51 crc kubenswrapper[5200]: > podSandboxID="22045fa10aa2375bd13cdf153c677f9c45bb2ffeff7d4d22da43ba9f0ae36396" Nov 26 14:48:51 crc kubenswrapper[5200]: E1126 14:48:51.557630 5200 kuberuntime_manager.go:1358] "Unhandled Error" err=< Nov 26 14:48:51 crc kubenswrapper[5200]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpk4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8598cb5bcf-ss6v7_openstack(f14e5fe6-d9d8-4111-a6fb-db042238318a): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f14e5fe6-d9d8-4111-a6fb-db042238318a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 26 14:48:51 crc kubenswrapper[5200]: > logger="UnhandledError" Nov 26 14:48:51 crc kubenswrapper[5200]: E1126 14:48:51.558745 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f14e5fe6-d9d8-4111-a6fb-db042238318a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.572086 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:48:51 crc kubenswrapper[5200]: I1126 14:48:51.699436 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 14:48:52 crc kubenswrapper[5200]: W1126 14:48:52.002495 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee40693c_087a_49b3_9a84_cbc50c66b21e.slice/crio-a26abf90785bbf6e7958a75fbb7ce3201e144b8a0cdf5a93de6d94aae220434e WatchSource:0}: Error finding container a26abf90785bbf6e7958a75fbb7ce3201e144b8a0cdf5a93de6d94aae220434e: Status 404 returned error can't find the container with id a26abf90785bbf6e7958a75fbb7ce3201e144b8a0cdf5a93de6d94aae220434e Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.002919 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.113556 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: W1126 14:48:52.120113 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87a0d855_31af_4e13_8461_4afd0f0edaaf.slice/crio-b7fd13a310767f3a6b0caac7371cc18188a3a0d13e8fb7de0ce77ebe59d5bc15 WatchSource:0}: Error finding container b7fd13a310767f3a6b0caac7371cc18188a3a0d13e8fb7de0ce77ebe59d5bc15: Status 404 returned error can't find the container with id b7fd13a310767f3a6b0caac7371cc18188a3a0d13e8fb7de0ce77ebe59d5bc15 Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.272966 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.279394 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.285393 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"galera-openstack-dockercfg-6xxln\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.286040 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-scripts\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.286338 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-galera-openstack-svc\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.289100 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"combined-ca-bundle\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.289278 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-config-data\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.293373 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.347289 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ee40693c-087a-49b3-9a84-cbc50c66b21e","Type":"ContainerStarted","Data":"a26abf90785bbf6e7958a75fbb7ce3201e144b8a0cdf5a93de6d94aae220434e"} Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.349083 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87a0d855-31af-4e13-8461-4afd0f0edaaf","Type":"ContainerStarted","Data":"b7fd13a310767f3a6b0caac7371cc18188a3a0d13e8fb7de0ce77ebe59d5bc15"} Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.351069 5200 generic.go:358] 
"Generic (PLEG): container finished" podID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerID="ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf" exitCode=0 Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.351129 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" event={"ID":"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646","Type":"ContainerDied","Data":"ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf"} Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360437 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-kolla-config\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360507 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb08e317-49b5-4bda-9130-0cb6c5780603-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360539 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-config-data-default\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360570 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb08e317-49b5-4bda-9130-0cb6c5780603-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360605 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrt79\" (UniqueName: \"kubernetes.io/projected/bb08e317-49b5-4bda-9130-0cb6c5780603-kube-api-access-hrt79\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360642 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb08e317-49b5-4bda-9130-0cb6c5780603-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360710 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.360769 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.462926 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463031 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463208 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-kolla-config\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463300 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb08e317-49b5-4bda-9130-0cb6c5780603-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463332 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-config-data-default\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463382 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb08e317-49b5-4bda-9130-0cb6c5780603-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463430 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrt79\" (UniqueName: \"kubernetes.io/projected/bb08e317-49b5-4bda-9130-0cb6c5780603-kube-api-access-hrt79\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463482 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb08e317-49b5-4bda-9130-0cb6c5780603-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.463870 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-kolla-config\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " 
pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.464195 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb08e317-49b5-4bda-9130-0cb6c5780603-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.465047 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-config-data-default\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.465320 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb08e317-49b5-4bda-9130-0cb6c5780603-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.467670 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb08e317-49b5-4bda-9130-0cb6c5780603-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.467825 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb08e317-49b5-4bda-9130-0cb6c5780603-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.478519 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.478562 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/79db370557ac407424700823dc40a4c9c96bf55024730e00861c4c8448a70d6d/globalmount\"" pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.486697 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrt79\" (UniqueName: \"kubernetes.io/projected/bb08e317-49b5-4bda-9130-0cb6c5780603-kube-api-access-hrt79\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.515008 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acdaa118-2d0a-41ca-a0b9-58155a6d9f5f\") pod \"openstack-galera-0\" (UID: \"bb08e317-49b5-4bda-9130-0cb6c5780603\") " pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.654439 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.813880 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.822806 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.837547 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"memcached-config-data\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.837779 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"memcached-memcached-dockercfg-4fmzz\"" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.844024 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.869601 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca944d31-1c29-4808-aed1-09dcdd775199-kolla-config\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.869730 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca944d31-1c29-4808-aed1-09dcdd775199-config-data\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.869758 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6gvk\" (UniqueName: \"kubernetes.io/projected/ca944d31-1c29-4808-aed1-09dcdd775199-kube-api-access-g6gvk\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.935581 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.971203 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6gvk\" (UniqueName: \"kubernetes.io/projected/ca944d31-1c29-4808-aed1-09dcdd775199-kube-api-access-g6gvk\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.971298 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca944d31-1c29-4808-aed1-09dcdd775199-kolla-config\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.971433 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca944d31-1c29-4808-aed1-09dcdd775199-config-data\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.972410 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca944d31-1c29-4808-aed1-09dcdd775199-config-data\") pod \"memcached-0\" (UID: 
\"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.972802 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca944d31-1c29-4808-aed1-09dcdd775199-kolla-config\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.977289 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa" path="/var/lib/kubelet/pods/88f9a73c-f85c-4bde-9229-ed9e7e5bcbfa/volumes" Nov 26 14:48:52 crc kubenswrapper[5200]: W1126 14:48:52.984161 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb08e317_49b5_4bda_9130_0cb6c5780603.slice/crio-529e0ae06e4e89b790811bb2c972266aa94250e49270dbedc529033845f6148e WatchSource:0}: Error finding container 529e0ae06e4e89b790811bb2c972266aa94250e49270dbedc529033845f6148e: Status 404 returned error can't find the container with id 529e0ae06e4e89b790811bb2c972266aa94250e49270dbedc529033845f6148e Nov 26 14:48:52 crc kubenswrapper[5200]: I1126 14:48:52.991322 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6gvk\" (UniqueName: \"kubernetes.io/projected/ca944d31-1c29-4808-aed1-09dcdd775199-kube-api-access-g6gvk\") pod \"memcached-0\" (UID: \"ca944d31-1c29-4808-aed1-09dcdd775199\") " pod="openstack/memcached-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.149785 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.370229 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ee40693c-087a-49b3-9a84-cbc50c66b21e","Type":"ContainerStarted","Data":"de09db6c53e17347b897453ad34d13b27a28189c2eccae635c6f6f8add67f68d"} Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.380280 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87a0d855-31af-4e13-8461-4afd0f0edaaf","Type":"ContainerStarted","Data":"6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a"} Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.404503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" event={"ID":"f14e5fe6-d9d8-4111-a6fb-db042238318a","Type":"ContainerStarted","Data":"7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23"} Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.404863 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.428104 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" event={"ID":"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646","Type":"ContainerStarted","Data":"3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317"} Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.428799 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.443878 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"bb08e317-49b5-4bda-9130-0cb6c5780603","Type":"ContainerStarted","Data":"529e0ae06e4e89b790811bb2c972266aa94250e49270dbedc529033845f6148e"} Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.507292 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" podStartSLOduration=4.507266418 podStartE2EDuration="4.507266418s" podCreationTimestamp="2025-11-26 14:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:48:53.499169913 +0000 UTC m=+4702.178371731" watchObservedRunningTime="2025-11-26 14:48:53.507266418 +0000 UTC m=+4702.186468236" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.557604 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" podStartSLOduration=4.55758358 podStartE2EDuration="4.55758358s" podCreationTimestamp="2025-11-26 14:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:48:53.526156468 +0000 UTC m=+4702.205358286" watchObservedRunningTime="2025-11-26 14:48:53.55758358 +0000 UTC m=+4702.236785398" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.558705 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.868606 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.874200 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.878931 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"galera-openstack-cell1-dockercfg-xn7tg\"" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.878931 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1-config-data\"" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.879143 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-galera-openstack-cell1-svc\"" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.880845 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1-scripts\"" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.881814 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989120 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5299e9c2-94ad-423b-8b68-790b3554d0e0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989208 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 
14:48:53.989293 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf422\" (UniqueName: \"kubernetes.io/projected/5299e9c2-94ad-423b-8b68-790b3554d0e0-kube-api-access-hf422\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989340 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989487 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989674 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5299e9c2-94ad-423b-8b68-790b3554d0e0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989845 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:53 crc kubenswrapper[5200]: I1126 14:48:53.989951 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5299e9c2-94ad-423b-8b68-790b3554d0e0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.091487 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5299e9c2-94ad-423b-8b68-790b3554d0e0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.091568 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.091615 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf422\" (UniqueName: \"kubernetes.io/projected/5299e9c2-94ad-423b-8b68-790b3554d0e0-kube-api-access-hf422\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " 
pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.091641 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.091669 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.091725 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5299e9c2-94ad-423b-8b68-790b3554d0e0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.092060 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.092217 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5299e9c2-94ad-423b-8b68-790b3554d0e0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.092672 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5299e9c2-94ad-423b-8b68-790b3554d0e0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.092901 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.093090 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.093764 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5299e9c2-94ad-423b-8b68-790b3554d0e0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc 
kubenswrapper[5200]: I1126 14:48:54.097170 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5299e9c2-94ad-423b-8b68-790b3554d0e0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.097831 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5299e9c2-94ad-423b-8b68-790b3554d0e0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.106630 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.106669 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2eafc244d704ea510df25ac2cc12acfe45b59e55ceee4deb6edb3c9ae4e770f5/globalmount\"" pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.113301 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf422\" (UniqueName: \"kubernetes.io/projected/5299e9c2-94ad-423b-8b68-790b3554d0e0-kube-api-access-hf422\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.149810 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9408eda1-8a95-4e2c-9a57-f3ddbd993a4a\") pod \"openstack-cell1-galera-0\" (UID: \"5299e9c2-94ad-423b-8b68-790b3554d0e0\") " pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.200464 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.453426 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ca944d31-1c29-4808-aed1-09dcdd775199","Type":"ContainerStarted","Data":"ef653d406f4cc3898311b288b7bfe0b08dc35cdc1bb8bbc13843a4bdafd40d02"} Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.453499 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ca944d31-1c29-4808-aed1-09dcdd775199","Type":"ContainerStarted","Data":"8cdba6835213bd6618f2dfc4551f77c5fd5d63fd64e002e37a28f4b8c27ca8ac"} Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.453661 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/memcached-0" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.455892 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb08e317-49b5-4bda-9130-0cb6c5780603","Type":"ContainerStarted","Data":"7ed0632d183cfb0b70f9bc152ae73f3129f7ee10004bf98be8df60c369a1a1ec"} Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.498586 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.498563501 podStartE2EDuration="2.498563501s" podCreationTimestamp="2025-11-26 14:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:48:54.473947259 +0000 UTC m=+4703.153149077" watchObservedRunningTime="2025-11-26 14:48:54.498563501 +0000 UTC m=+4703.177765329" Nov 26 14:48:54 crc kubenswrapper[5200]: I1126 14:48:54.673151 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 14:48:54 crc kubenswrapper[5200]: W1126 14:48:54.677160 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5299e9c2_94ad_423b_8b68_790b3554d0e0.slice/crio-f21715b8104bde276bb0fed443d42863164f43be59d82ee68eff646a3a2a3b13 WatchSource:0}: Error finding container f21715b8104bde276bb0fed443d42863164f43be59d82ee68eff646a3a2a3b13: Status 404 returned error can't find the container with id f21715b8104bde276bb0fed443d42863164f43be59d82ee68eff646a3a2a3b13 Nov 26 14:48:55 crc kubenswrapper[5200]: I1126 14:48:55.466852 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5299e9c2-94ad-423b-8b68-790b3554d0e0","Type":"ContainerStarted","Data":"0fc03fbe9457cff1621c4e415e10c164fde66ae0afc43751d02c40e7c9c78bb9"} Nov 26 14:48:55 crc kubenswrapper[5200]: I1126 14:48:55.467180 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5299e9c2-94ad-423b-8b68-790b3554d0e0","Type":"ContainerStarted","Data":"f21715b8104bde276bb0fed443d42863164f43be59d82ee68eff646a3a2a3b13"} Nov 26 14:48:57 crc kubenswrapper[5200]: I1126 14:48:57.483343 5200 generic.go:358] "Generic (PLEG): container finished" podID="bb08e317-49b5-4bda-9130-0cb6c5780603" containerID="7ed0632d183cfb0b70f9bc152ae73f3129f7ee10004bf98be8df60c369a1a1ec" exitCode=0 Nov 26 14:48:57 crc kubenswrapper[5200]: I1126 14:48:57.483435 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"bb08e317-49b5-4bda-9130-0cb6c5780603","Type":"ContainerDied","Data":"7ed0632d183cfb0b70f9bc152ae73f3129f7ee10004bf98be8df60c369a1a1ec"} Nov 26 14:48:58 crc kubenswrapper[5200]: I1126 14:48:58.492893 5200 generic.go:358] "Generic (PLEG): container finished" podID="5299e9c2-94ad-423b-8b68-790b3554d0e0" containerID="0fc03fbe9457cff1621c4e415e10c164fde66ae0afc43751d02c40e7c9c78bb9" exitCode=0 Nov 26 14:48:58 crc kubenswrapper[5200]: I1126 14:48:58.492967 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5299e9c2-94ad-423b-8b68-790b3554d0e0","Type":"ContainerDied","Data":"0fc03fbe9457cff1621c4e415e10c164fde66ae0afc43751d02c40e7c9c78bb9"} Nov 26 14:48:58 crc kubenswrapper[5200]: I1126 14:48:58.498979 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb08e317-49b5-4bda-9130-0cb6c5780603","Type":"ContainerStarted","Data":"51e3ae477d6e6136d0b6d0fdf62e50770864a2b1442fc10a38f84718ff833b85"} Nov 26 14:48:58 crc kubenswrapper[5200]: I1126 14:48:58.543598 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.543579705 podStartE2EDuration="7.543579705s" podCreationTimestamp="2025-11-26 14:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:48:58.539979299 +0000 UTC m=+4707.219181127" watchObservedRunningTime="2025-11-26 14:48:58.543579705 +0000 UTC m=+4707.222781523" Nov 26 14:48:59 crc kubenswrapper[5200]: I1126 14:48:59.457592 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:48:59 crc kubenswrapper[5200]: I1126 14:48:59.507786 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5299e9c2-94ad-423b-8b68-790b3554d0e0","Type":"ContainerStarted","Data":"5b3dfcab80f1fcb9c2f71e5b90b582fcebd7724ed0fc18fbd73a6959b3c0947c"} Nov 26 14:48:59 crc kubenswrapper[5200]: I1126 14:48:59.536890 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.536674657 podStartE2EDuration="7.536674657s" podCreationTimestamp="2025-11-26 14:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:48:59.532838935 +0000 UTC m=+4708.212040763" watchObservedRunningTime="2025-11-26 14:48:59.536674657 +0000 UTC m=+4708.215876485" Nov 26 14:49:00 crc kubenswrapper[5200]: I1126 14:49:00.468825 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:49:00 crc kubenswrapper[5200]: I1126 14:49:00.469131 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 26 14:49:00 crc kubenswrapper[5200]: I1126 14:49:00.546195 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8598cb5bcf-ss6v7"] Nov 26 14:49:00 crc kubenswrapper[5200]: I1126 14:49:00.546915 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerName="dnsmasq-dns" containerID="cri-o://7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23" gracePeriod=10 Nov 26 14:49:00 crc kubenswrapper[5200]: I1126 
14:49:00.968019 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.131699 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-dns-svc\") pod \"f14e5fe6-d9d8-4111-a6fb-db042238318a\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.131854 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpk4j\" (UniqueName: \"kubernetes.io/projected/f14e5fe6-d9d8-4111-a6fb-db042238318a-kube-api-access-cpk4j\") pod \"f14e5fe6-d9d8-4111-a6fb-db042238318a\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.131887 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-config\") pod \"f14e5fe6-d9d8-4111-a6fb-db042238318a\" (UID: \"f14e5fe6-d9d8-4111-a6fb-db042238318a\") " Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.141848 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f14e5fe6-d9d8-4111-a6fb-db042238318a-kube-api-access-cpk4j" (OuterVolumeSpecName: "kube-api-access-cpk4j") pod "f14e5fe6-d9d8-4111-a6fb-db042238318a" (UID: "f14e5fe6-d9d8-4111-a6fb-db042238318a"). InnerVolumeSpecName "kube-api-access-cpk4j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.166400 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f14e5fe6-d9d8-4111-a6fb-db042238318a" (UID: "f14e5fe6-d9d8-4111-a6fb-db042238318a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.179151 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-config" (OuterVolumeSpecName: "config") pod "f14e5fe6-d9d8-4111-a6fb-db042238318a" (UID: "f14e5fe6-d9d8-4111-a6fb-db042238318a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.234184 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpk4j\" (UniqueName: \"kubernetes.io/projected/f14e5fe6-d9d8-4111-a6fb-db042238318a-kube-api-access-cpk4j\") on node \"crc\" DevicePath \"\"" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.234475 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.234485 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f14e5fe6-d9d8-4111-a6fb-db042238318a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.526814 5200 generic.go:358] "Generic (PLEG): container finished" podID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerID="7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23" exitCode=0 Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.526858 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" event={"ID":"f14e5fe6-d9d8-4111-a6fb-db042238318a","Type":"ContainerDied","Data":"7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23"} Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.526901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" event={"ID":"f14e5fe6-d9d8-4111-a6fb-db042238318a","Type":"ContainerDied","Data":"22045fa10aa2375bd13cdf153c677f9c45bb2ffeff7d4d22da43ba9f0ae36396"} Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.526922 5200 scope.go:117] "RemoveContainer" containerID="7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.526938 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8598cb5bcf-ss6v7" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.552960 5200 scope.go:117] "RemoveContainer" containerID="0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.572921 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8598cb5bcf-ss6v7"] Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.573248 5200 scope.go:117] "RemoveContainer" containerID="7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23" Nov 26 14:49:01 crc kubenswrapper[5200]: E1126 14:49:01.573727 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23\": container with ID starting with 7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23 not found: ID does not exist" containerID="7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.573775 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23"} err="failed to get container status \"7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23\": rpc error: code = NotFound desc = could not find container \"7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23\": container with ID starting with 7d07f5c2ebd70a33304df062179e3c62b1585029668d691649d376e9cf296c23 not found: ID does not exist" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.573808 5200 scope.go:117] "RemoveContainer" containerID="0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b" Nov 26 14:49:01 crc kubenswrapper[5200]: E1126 14:49:01.574948 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b\": container with ID starting with 0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b not found: ID does not exist" containerID="0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.574978 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b"} err="failed to get container status \"0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b\": rpc error: code = NotFound desc = could not find container \"0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b\": container with ID starting with 0bc95476a40d0bb0d5776fa344f51a52106500a7735adcb005d77e4ec2083d6b not found: ID does not exist" Nov 26 14:49:01 crc kubenswrapper[5200]: I1126 14:49:01.580030 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8598cb5bcf-ss6v7"] Nov 26 14:49:02 crc kubenswrapper[5200]: I1126 14:49:02.655264 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/openstack-galera-0" Nov 26 14:49:02 crc kubenswrapper[5200]: I1126 14:49:02.655648 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 26 14:49:02 crc kubenswrapper[5200]: I1126 14:49:02.980013 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" path="/var/lib/kubelet/pods/f14e5fe6-d9d8-4111-a6fb-db042238318a/volumes" Nov 26 14:49:04 crc kubenswrapper[5200]: I1126 14:49:04.201618 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 26 14:49:04 crc kubenswrapper[5200]: I1126 14:49:04.201727 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/openstack-cell1-galera-0" Nov 26 14:49:05 crc kubenswrapper[5200]: I1126 14:49:05.179405 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 26 14:49:05 crc kubenswrapper[5200]: I1126 14:49:05.255425 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 26 14:49:06 crc kubenswrapper[5200]: I1126 14:49:06.426484 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 26 14:49:06 crc kubenswrapper[5200]: I1126 14:49:06.504298 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 26 14:49:25 crc kubenswrapper[5200]: I1126 14:49:25.976750 5200 generic.go:358] "Generic (PLEG): container finished" podID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerID="de09db6c53e17347b897453ad34d13b27a28189c2eccae635c6f6f8add67f68d" exitCode=0 Nov 26 14:49:25 crc kubenswrapper[5200]: I1126 14:49:25.977427 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ee40693c-087a-49b3-9a84-cbc50c66b21e","Type":"ContainerDied","Data":"de09db6c53e17347b897453ad34d13b27a28189c2eccae635c6f6f8add67f68d"} Nov 26 14:49:25 crc kubenswrapper[5200]: I1126 14:49:25.981482 5200 generic.go:358] "Generic (PLEG): container finished" podID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerID="6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a" exitCode=0 Nov 26 14:49:25 crc kubenswrapper[5200]: I1126 14:49:25.981880 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87a0d855-31af-4e13-8461-4afd0f0edaaf","Type":"ContainerDied","Data":"6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a"} Nov 26 14:49:26 crc kubenswrapper[5200]: I1126 14:49:26.991201 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ee40693c-087a-49b3-9a84-cbc50c66b21e","Type":"ContainerStarted","Data":"50dda4812a7654b4bc1fff6e9728feb007b1d7c24cdd75ff8483579ea69e3846"} Nov 26 14:49:26 crc kubenswrapper[5200]: I1126 14:49:26.993260 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87a0d855-31af-4e13-8461-4afd0f0edaaf","Type":"ContainerStarted","Data":"698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5"} Nov 26 14:49:26 crc kubenswrapper[5200]: I1126 14:49:26.993537 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:49:26 crc kubenswrapper[5200]: I1126 14:49:26.994556 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-server-0" Nov 26 14:49:27 crc kubenswrapper[5200]: I1126 14:49:27.029559 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.02953931 podStartE2EDuration="37.02953931s" podCreationTimestamp="2025-11-26 
14:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:49:27.027532277 +0000 UTC m=+4735.706734105" watchObservedRunningTime="2025-11-26 14:49:27.02953931 +0000 UTC m=+4735.708741138" Nov 26 14:49:27 crc kubenswrapper[5200]: I1126 14:49:27.047745 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.04772816 podStartE2EDuration="38.04772816s" podCreationTimestamp="2025-11-26 14:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:49:27.045649665 +0000 UTC m=+4735.724851483" watchObservedRunningTime="2025-11-26 14:49:27.04772816 +0000 UTC m=+4735.726929978" Nov 26 14:49:38 crc kubenswrapper[5200]: I1126 14:49:38.004103 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused" Nov 26 14:49:38 crc kubenswrapper[5200]: I1126 14:49:38.004877 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5672: connect: connection refused" Nov 26 14:49:42 crc kubenswrapper[5200]: E1126 14:49:42.246511 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1304231 actualBytes=10240 Nov 26 14:49:48 crc kubenswrapper[5200]: I1126 14:49:48.002572 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 26 14:49:48 crc kubenswrapper[5200]: I1126 14:49:48.003364 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.607377 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77c584ff65-fbljb"] Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.608937 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerName="init" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.608956 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerName="init" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.609006 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerName="dnsmasq-dns" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.609018 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerName="dnsmasq-dns" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.609277 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f14e5fe6-d9d8-4111-a6fb-db042238318a" containerName="dnsmasq-dns" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.618095 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.634333 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c584ff65-fbljb"] Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.736892 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-dns-svc\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.737485 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-config\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.737559 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7ph\" (UniqueName: \"kubernetes.io/projected/21fdd50d-227d-4dac-9f8e-71192a857118-kube-api-access-ss7ph\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.839871 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-config\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.839986 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7ph\" (UniqueName: \"kubernetes.io/projected/21fdd50d-227d-4dac-9f8e-71192a857118-kube-api-access-ss7ph\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.840763 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-dns-svc\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.841466 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-config\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.841967 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-dns-svc\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.865949 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7ph\" (UniqueName: 
\"kubernetes.io/projected/21fdd50d-227d-4dac-9f8e-71192a857118-kube-api-access-ss7ph\") pod \"dnsmasq-dns-77c584ff65-fbljb\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:51 crc kubenswrapper[5200]: I1126 14:49:51.943658 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:52 crc kubenswrapper[5200]: I1126 14:49:52.408209 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:49:52 crc kubenswrapper[5200]: I1126 14:49:52.580118 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c584ff65-fbljb"] Nov 26 14:49:52 crc kubenswrapper[5200]: W1126 14:49:52.585533 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21fdd50d_227d_4dac_9f8e_71192a857118.slice/crio-6169f5bc21ff14db0c722e17cdad40c184c73c46f0c39900cc0b110b8f4b953c WatchSource:0}: Error finding container 6169f5bc21ff14db0c722e17cdad40c184c73c46f0c39900cc0b110b8f4b953c: Status 404 returned error can't find the container with id 6169f5bc21ff14db0c722e17cdad40c184c73c46f0c39900cc0b110b8f4b953c Nov 26 14:49:52 crc kubenswrapper[5200]: I1126 14:49:52.962289 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:49:53 crc kubenswrapper[5200]: I1126 14:49:53.208595 5200 generic.go:358] "Generic (PLEG): container finished" podID="21fdd50d-227d-4dac-9f8e-71192a857118" containerID="ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48" exitCode=0 Nov 26 14:49:53 crc kubenswrapper[5200]: I1126 14:49:53.208752 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" event={"ID":"21fdd50d-227d-4dac-9f8e-71192a857118","Type":"ContainerDied","Data":"ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48"} Nov 26 14:49:53 crc kubenswrapper[5200]: I1126 14:49:53.208802 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" event={"ID":"21fdd50d-227d-4dac-9f8e-71192a857118","Type":"ContainerStarted","Data":"6169f5bc21ff14db0c722e17cdad40c184c73c46f0c39900cc0b110b8f4b953c"} Nov 26 14:49:54 crc kubenswrapper[5200]: I1126 14:49:54.222037 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" event={"ID":"21fdd50d-227d-4dac-9f8e-71192a857118","Type":"ContainerStarted","Data":"ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7"} Nov 26 14:49:54 crc kubenswrapper[5200]: I1126 14:49:54.222738 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:49:54 crc kubenswrapper[5200]: I1126 14:49:54.243180 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" podStartSLOduration=3.243143315 podStartE2EDuration="3.243143315s" podCreationTimestamp="2025-11-26 14:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:49:54.24105035 +0000 UTC m=+4762.920252158" watchObservedRunningTime="2025-11-26 14:49:54.243143315 +0000 UTC m=+4762.922345143" Nov 26 14:49:54 crc kubenswrapper[5200]: I1126 14:49:54.323853 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" 
podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="rabbitmq" containerID="cri-o://698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5" gracePeriod=604799 Nov 26 14:49:54 crc kubenswrapper[5200]: I1126 14:49:54.821789 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="rabbitmq" containerID="cri-o://50dda4812a7654b4bc1fff6e9728feb007b1d7c24cdd75ff8483579ea69e3846" gracePeriod=604799 Nov 26 14:49:58 crc kubenswrapper[5200]: I1126 14:49:58.001427 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused" Nov 26 14:49:58 crc kubenswrapper[5200]: I1126 14:49:58.001464 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5672: connect: connection refused" Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.234105 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.351056 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5798f4c845-cz7sk"] Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.351350 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="dnsmasq-dns" containerID="cri-o://3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317" gracePeriod=10 Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.468304 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.237:5353: connect: connection refused" Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.778482 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.933659 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6fc\" (UniqueName: \"kubernetes.io/projected/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-kube-api-access-bh6fc\") pod \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.933786 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-dns-svc\") pod \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.933822 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-config\") pod \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\" (UID: \"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646\") " Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.941182 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-kube-api-access-bh6fc" (OuterVolumeSpecName: "kube-api-access-bh6fc") pod "7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" (UID: "7a1aef2e-6ca3-4516-aeee-fe6cda2c6646"). InnerVolumeSpecName "kube-api-access-bh6fc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.960069 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 14:50:00 crc kubenswrapper[5200]: I1126 14:50:00.984899 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" (UID: "7a1aef2e-6ca3-4516-aeee-fe6cda2c6646"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.011362 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-config" (OuterVolumeSpecName: "config") pod "7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" (UID: "7a1aef2e-6ca3-4516-aeee-fe6cda2c6646"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.038664 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87a0d855-31af-4e13-8461-4afd0f0edaaf-pod-info\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.039237 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.039278 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-erlang-cookie\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.039378 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87a0d855-31af-4e13-8461-4afd0f0edaaf-erlang-cookie-secret\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.039407 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-plugins\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.039500 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq9l6\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-kube-api-access-wq9l6\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.039527 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-confd\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.041240 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.042723 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.047569 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-kube-api-access-wq9l6" (OuterVolumeSpecName: "kube-api-access-wq9l6") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "kube-api-access-wq9l6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.050457 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-server-conf\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.050639 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-plugins-conf\") pod \"87a0d855-31af-4e13-8461-4afd0f0edaaf\" (UID: \"87a0d855-31af-4e13-8461-4afd0f0edaaf\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051095 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051403 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051424 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051434 5200 reconciler_common.go:299] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051445 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051454 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051464 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bh6fc\" (UniqueName: \"kubernetes.io/projected/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646-kube-api-access-bh6fc\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.051473 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wq9l6\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-kube-api-access-wq9l6\") on node \"crc\" DevicePath \"\"" Nov 26 
14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.053000 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a0d855-31af-4e13-8461-4afd0f0edaaf-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.059960 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/87a0d855-31af-4e13-8461-4afd0f0edaaf-pod-info" (OuterVolumeSpecName: "pod-info") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.061988 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8" (OuterVolumeSpecName: "persistence") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "pvc-7a9df237-c224-4227-a786-71b5a60b7af8". PluginName "kubernetes.io/csi", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.067938 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-server-conf" (OuterVolumeSpecName: "server-conf") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.143324 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "87a0d855-31af-4e13-8461-4afd0f0edaaf" (UID: "87a0d855-31af-4e13-8461-4afd0f0edaaf"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.153222 5200 reconciler_common.go:299] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/87a0d855-31af-4e13-8461-4afd0f0edaaf-server-conf\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.153432 5200 reconciler_common.go:299] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/87a0d855-31af-4e13-8461-4afd0f0edaaf-pod-info\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.153563 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") on node \"crc\" " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.153677 5200 reconciler_common.go:299] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/87a0d855-31af-4e13-8461-4afd0f0edaaf-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.153807 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/87a0d855-31af-4e13-8461-4afd0f0edaaf-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.170668 5200 csi_attacher.go:567] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.170863 5200 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-7a9df237-c224-4227-a786-71b5a60b7af8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8") on node "crc" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.266466 5200 reconciler_common.go:299] "Volume detached for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.327171 5200 generic.go:358] "Generic (PLEG): container finished" podID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerID="3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317" exitCode=0 Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.327393 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" event={"ID":"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646","Type":"ContainerDied","Data":"3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317"} Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.327436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" event={"ID":"7a1aef2e-6ca3-4516-aeee-fe6cda2c6646","Type":"ContainerDied","Data":"889d76ab7b46906fc4f692a646f8c849ae1815f2d4fb5941195a49ece4306a06"} Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.327458 5200 scope.go:117] "RemoveContainer" containerID="3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.327633 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5798f4c845-cz7sk" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.333008 5200 generic.go:358] "Generic (PLEG): container finished" podID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerID="50dda4812a7654b4bc1fff6e9728feb007b1d7c24cdd75ff8483579ea69e3846" exitCode=0 Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.333115 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ee40693c-087a-49b3-9a84-cbc50c66b21e","Type":"ContainerDied","Data":"50dda4812a7654b4bc1fff6e9728feb007b1d7c24cdd75ff8483579ea69e3846"} Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.335409 5200 generic.go:358] "Generic (PLEG): container finished" podID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerID="698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5" exitCode=0 Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.335503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87a0d855-31af-4e13-8461-4afd0f0edaaf","Type":"ContainerDied","Data":"698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5"} Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.335529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"87a0d855-31af-4e13-8461-4afd0f0edaaf","Type":"ContainerDied","Data":"b7fd13a310767f3a6b0caac7371cc18188a3a0d13e8fb7de0ce77ebe59d5bc15"} Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.335850 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.361175 5200 scope.go:117] "RemoveContainer" containerID="ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.366341 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5798f4c845-cz7sk"] Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.366658 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.377846 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5798f4c845-cz7sk"] Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.431977 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.443259 5200 scope.go:117] "RemoveContainer" containerID="3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317" Nov 26 14:50:01 crc kubenswrapper[5200]: E1126 14:50:01.450495 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317\": container with ID starting with 3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317 not found: ID does not exist" containerID="3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.450548 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317"} err="failed to get container status \"3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317\": rpc error: code = NotFound desc = could not find container \"3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317\": container with ID starting with 3ea3a93dd605b61c35e3fc668aef642f3371e368632df92731ccad5678ef1317 not found: ID does not exist" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.450574 5200 scope.go:117] "RemoveContainer" containerID="ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.467931 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473295 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-erlang-cookie\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473360 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-confd\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473524 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473571 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-plugins\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473652 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-plugins-conf\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473703 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-server-conf\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473749 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee40693c-087a-49b3-9a84-cbc50c66b21e-erlang-cookie-secret\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473777 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf4sc\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-kube-api-access-lf4sc\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473837 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee40693c-087a-49b3-9a84-cbc50c66b21e-pod-info\") pod \"ee40693c-087a-49b3-9a84-cbc50c66b21e\" (UID: \"ee40693c-087a-49b3-9a84-cbc50c66b21e\") " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.473999 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.474094 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.474841 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.474864 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.479629 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480828 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="dnsmasq-dns" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480841 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="dnsmasq-dns" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480874 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="setup-container" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480881 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="setup-container" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480895 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="rabbitmq" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480900 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="rabbitmq" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480920 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="rabbitmq" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480925 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="rabbitmq" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480943 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="setup-container" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480951 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="setup-container" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480959 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="init" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.480965 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="init" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.481089 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" containerName="dnsmasq-dns" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.481096 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" containerName="rabbitmq" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.481113 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" containerName="rabbitmq" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.488357 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.494550 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ee40693c-087a-49b3-9a84-cbc50c66b21e-pod-info" (OuterVolumeSpecName: "pod-info") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.494597 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee40693c-087a-49b3-9a84-cbc50c66b21e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.494671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-kube-api-access-lf4sc" (OuterVolumeSpecName: "kube-api-access-lf4sc") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "kube-api-access-lf4sc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: E1126 14:50:01.494806 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf\": container with ID starting with ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf not found: ID does not exist" containerID="ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.502831 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf"} err="failed to get container status \"ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf\": rpc error: code = NotFound desc = could not find container \"ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf\": container with ID starting with ccbbd64c0bec2de04ddcd869ddb56ce3920e0d035809e681d655517cefd8d8cf not found: ID does not exist" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.502870 5200 scope.go:117] "RemoveContainer" containerID="698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.495055 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-server-conf\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.495079 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-server-dockercfg-68vn9\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.495142 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-erlang-cookie\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.495247 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-default-user\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.495098 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openstack\"/\"rabbitmq-plugins-conf\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.506089 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.509304 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559" (OuterVolumeSpecName: "persistence") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559". PluginName "kubernetes.io/csi", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.520331 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-server-conf" (OuterVolumeSpecName: "server-conf") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.573908 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ee40693c-087a-49b3-9a84-cbc50c66b21e" (UID: "ee40693c-087a-49b3-9a84-cbc50c66b21e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.575728 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c6740ee-f526-4bb4-b6c9-ab3304b62709-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.575821 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.575856 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btx84\" (UniqueName: \"kubernetes.io/projected/5c6740ee-f526-4bb4-b6c9-ab3304b62709-kube-api-access-btx84\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.575966 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c6740ee-f526-4bb4-b6c9-ab3304b62709-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.576001 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c6740ee-f526-4bb4-b6c9-ab3304b62709-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc 
kubenswrapper[5200]: I1126 14:50:01.576149 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.576199 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c6740ee-f526-4bb4-b6c9-ab3304b62709-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.576244 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.576312 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577602 5200 reconciler_common.go:299] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee40693c-087a-49b3-9a84-cbc50c66b21e-pod-info\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577625 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577651 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") on node \"crc\" " Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577663 5200 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee40693c-087a-49b3-9a84-cbc50c66b21e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577675 5200 reconciler_common.go:299] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577685 5200 reconciler_common.go:299] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee40693c-087a-49b3-9a84-cbc50c66b21e-server-conf\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577744 5200 reconciler_common.go:299] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee40693c-087a-49b3-9a84-cbc50c66b21e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.577755 5200 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lf4sc\" (UniqueName: \"kubernetes.io/projected/ee40693c-087a-49b3-9a84-cbc50c66b21e-kube-api-access-lf4sc\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.593045 5200 csi_attacher.go:567] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.593193 5200 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559") on node "crc" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.618588 5200 scope.go:117] "RemoveContainer" containerID="6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.649668 5200 scope.go:117] "RemoveContainer" containerID="698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5" Nov 26 14:50:01 crc kubenswrapper[5200]: E1126 14:50:01.650466 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5\": container with ID starting with 698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5 not found: ID does not exist" containerID="698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.650514 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5"} err="failed to get container status \"698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5\": rpc error: code = NotFound desc = could not find container \"698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5\": container with ID starting with 698942278da56d78ee3a3f5353ae8480a54d39576a702aa2b955612baf5604f5 not found: ID does not exist" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.650536 5200 scope.go:117] "RemoveContainer" containerID="6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a" Nov 26 14:50:01 crc kubenswrapper[5200]: E1126 14:50:01.650803 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a\": container with ID starting with 6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a not found: ID does not exist" containerID="6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.650820 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a"} err="failed to get container status \"6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a\": rpc error: code = NotFound desc = could not find container \"6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a\": container with ID starting with 6f31ee32530470b7b6602022cbf2cbfecac37f30fc57e8bef9c99393a4864e6a not found: ID does not exist" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679071 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5c6740ee-f526-4bb4-b6c9-ab3304b62709-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679146 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679180 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c6740ee-f526-4bb4-b6c9-ab3304b62709-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679199 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679223 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679250 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c6740ee-f526-4bb4-b6c9-ab3304b62709-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679264 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679285 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btx84\" (UniqueName: \"kubernetes.io/projected/5c6740ee-f526-4bb4-b6c9-ab3304b62709-kube-api-access-btx84\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679324 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c6740ee-f526-4bb4-b6c9-ab3304b62709-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.679397 5200 reconciler_common.go:299] "Volume detached for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:01 crc kubenswrapper[5200]: 
I1126 14:50:01.680222 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.680459 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.680614 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c6740ee-f526-4bb4-b6c9-ab3304b62709-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.681003 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c6740ee-f526-4bb4-b6c9-ab3304b62709-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.682138 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.682167 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2f1017e1977e01fab334d49bcc5d7405af4398edd347a7a80d8f13ccf4900573/globalmount\"" pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.684364 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c6740ee-f526-4bb4-b6c9-ab3304b62709-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.685008 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c6740ee-f526-4bb4-b6c9-ab3304b62709-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.685869 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c6740ee-f526-4bb4-b6c9-ab3304b62709-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.700172 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btx84\" (UniqueName: \"kubernetes.io/projected/5c6740ee-f526-4bb4-b6c9-ab3304b62709-kube-api-access-btx84\") pod \"rabbitmq-server-0\" (UID: 
\"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.707133 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-7a9df237-c224-4227-a786-71b5a60b7af8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a9df237-c224-4227-a786-71b5a60b7af8\") pod \"rabbitmq-server-0\" (UID: \"5c6740ee-f526-4bb4-b6c9-ab3304b62709\") " pod="openstack/rabbitmq-server-0" Nov 26 14:50:01 crc kubenswrapper[5200]: I1126 14:50:01.933627 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.346738 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ee40693c-087a-49b3-9a84-cbc50c66b21e","Type":"ContainerDied","Data":"a26abf90785bbf6e7958a75fbb7ce3201e144b8a0cdf5a93de6d94aae220434e"} Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.346784 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.346794 5200 scope.go:117] "RemoveContainer" containerID="50dda4812a7654b4bc1fff6e9728feb007b1d7c24cdd75ff8483579ea69e3846" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.367247 5200 scope.go:117] "RemoveContainer" containerID="de09db6c53e17347b897453ad34d13b27a28189c2eccae635c6f6f8add67f68d" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.392889 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.404725 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.413414 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.423341 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.443906 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.444104 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.448409 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-erlang-cookie\"" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.448927 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-conf\"" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.452955 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-default-user\"" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.454108 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-dockercfg-lqkrx\"" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.454287 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-plugins-conf\"" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495183 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dec395e-bd7f-4021-a151-5a86dd782ee2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495222 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dec395e-bd7f-4021-a151-5a86dd782ee2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495341 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495454 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495473 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dec395e-bd7f-4021-a151-5a86dd782ee2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495566 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlf2q\" (UniqueName: \"kubernetes.io/projected/1dec395e-bd7f-4021-a151-5a86dd782ee2-kube-api-access-jlf2q\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495590 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495659 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dec395e-bd7f-4021-a151-5a86dd782ee2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.495696 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.597334 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dec395e-bd7f-4021-a151-5a86dd782ee2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.597388 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dec395e-bd7f-4021-a151-5a86dd782ee2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.598157 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.598342 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.598715 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dec395e-bd7f-4021-a151-5a86dd782ee2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.598782 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.598839 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dec395e-bd7f-4021-a151-5a86dd782ee2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.599055 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlf2q\" (UniqueName: \"kubernetes.io/projected/1dec395e-bd7f-4021-a151-5a86dd782ee2-kube-api-access-jlf2q\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.599915 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dec395e-bd7f-4021-a151-5a86dd782ee2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.600755 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.600881 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dec395e-bd7f-4021-a151-5a86dd782ee2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.600918 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dec395e-bd7f-4021-a151-5a86dd782ee2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.600949 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.601549 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.601873 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b3e74fd8d458bb9aaf3401e4a41a44475872a1d27ea67bd524df4aa3945308e1/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.601551 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.608407 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dec395e-bd7f-4021-a151-5a86dd782ee2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.608648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dec395e-bd7f-4021-a151-5a86dd782ee2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.621994 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlf2q\" (UniqueName: \"kubernetes.io/projected/1dec395e-bd7f-4021-a151-5a86dd782ee2-kube-api-access-jlf2q\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.640490 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a52084a-836c-4b1e-bfcd-43cdfef56559\") pod \"rabbitmq-cell1-server-0\" (UID: \"1dec395e-bd7f-4021-a151-5a86dd782ee2\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:02 crc kubenswrapper[5200]: I1126 14:50:02.843485 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:03 crc kubenswrapper[5200]: I1126 14:50:03.000548 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1aef2e-6ca3-4516-aeee-fe6cda2c6646" path="/var/lib/kubelet/pods/7a1aef2e-6ca3-4516-aeee-fe6cda2c6646/volumes" Nov 26 14:50:03 crc kubenswrapper[5200]: I1126 14:50:03.001855 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a0d855-31af-4e13-8461-4afd0f0edaaf" path="/var/lib/kubelet/pods/87a0d855-31af-4e13-8461-4afd0f0edaaf/volumes" Nov 26 14:50:03 crc kubenswrapper[5200]: I1126 14:50:03.003215 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee40693c-087a-49b3-9a84-cbc50c66b21e" path="/var/lib/kubelet/pods/ee40693c-087a-49b3-9a84-cbc50c66b21e/volumes" Nov 26 14:50:03 crc kubenswrapper[5200]: I1126 14:50:03.340126 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 14:50:03 crc kubenswrapper[5200]: W1126 14:50:03.349025 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dec395e_bd7f_4021_a151_5a86dd782ee2.slice/crio-13a409ab9c91ba7f235a958754821f3f47da3892f66841aa11529ffae594d16e WatchSource:0}: Error finding container 13a409ab9c91ba7f235a958754821f3f47da3892f66841aa11529ffae594d16e: Status 404 returned error can't find the container with id 13a409ab9c91ba7f235a958754821f3f47da3892f66841aa11529ffae594d16e Nov 26 14:50:03 crc kubenswrapper[5200]: I1126 14:50:03.368129 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c6740ee-f526-4bb4-b6c9-ab3304b62709","Type":"ContainerStarted","Data":"41192c6b53e461ab62fc1ee363aadc4607b2540776397535c365f8a4e678e275"} Nov 26 14:50:03 crc kubenswrapper[5200]: I1126 14:50:03.370391 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1dec395e-bd7f-4021-a151-5a86dd782ee2","Type":"ContainerStarted","Data":"13a409ab9c91ba7f235a958754821f3f47da3892f66841aa11529ffae594d16e"} Nov 26 14:50:04 crc kubenswrapper[5200]: I1126 14:50:04.387800 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c6740ee-f526-4bb4-b6c9-ab3304b62709","Type":"ContainerStarted","Data":"b781c13576a10430420ad53f0e8298708a0e1ed33f87b702467eeee4fee959d4"} Nov 26 14:50:05 crc kubenswrapper[5200]: I1126 14:50:05.405647 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1dec395e-bd7f-4021-a151-5a86dd782ee2","Type":"ContainerStarted","Data":"524c1dd81cf114387157993e77427133c47d55bc7584c1f6be769f13c752bbe2"} Nov 26 14:50:36 crc kubenswrapper[5200]: I1126 14:50:36.785099 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:50:36 crc kubenswrapper[5200]: I1126 14:50:36.788420 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:50:36 crc kubenswrapper[5200]: I1126 14:50:36.790933 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:50:36 crc kubenswrapper[5200]: I1126 14:50:36.793346 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:50:37 crc kubenswrapper[5200]: I1126 14:50:37.694026 5200 generic.go:358] "Generic (PLEG): container finished" podID="1dec395e-bd7f-4021-a151-5a86dd782ee2" containerID="524c1dd81cf114387157993e77427133c47d55bc7584c1f6be769f13c752bbe2" exitCode=0 Nov 26 14:50:37 crc kubenswrapper[5200]: I1126 14:50:37.694122 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1dec395e-bd7f-4021-a151-5a86dd782ee2","Type":"ContainerDied","Data":"524c1dd81cf114387157993e77427133c47d55bc7584c1f6be769f13c752bbe2"} Nov 26 14:50:37 crc kubenswrapper[5200]: I1126 14:50:37.696406 5200 generic.go:358] "Generic (PLEG): container finished" podID="5c6740ee-f526-4bb4-b6c9-ab3304b62709" containerID="b781c13576a10430420ad53f0e8298708a0e1ed33f87b702467eeee4fee959d4" exitCode=0 Nov 26 14:50:37 crc kubenswrapper[5200]: I1126 14:50:37.696465 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c6740ee-f526-4bb4-b6c9-ab3304b62709","Type":"ContainerDied","Data":"b781c13576a10430420ad53f0e8298708a0e1ed33f87b702467eeee4fee959d4"} Nov 26 14:50:38 crc kubenswrapper[5200]: I1126 14:50:38.707301 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5c6740ee-f526-4bb4-b6c9-ab3304b62709","Type":"ContainerStarted","Data":"b76292d305aac27fd5c4ef28aac27260134ae28e8b423daf485fe41686de9f00"} Nov 26 14:50:38 crc kubenswrapper[5200]: I1126 14:50:38.708252 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-server-0" Nov 26 14:50:38 crc kubenswrapper[5200]: I1126 14:50:38.710175 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1dec395e-bd7f-4021-a151-5a86dd782ee2","Type":"ContainerStarted","Data":"85d1b486f3727c355101ab24f9045afa9b30cace3c30627261d3850c48be5223"} Nov 26 14:50:38 crc kubenswrapper[5200]: I1126 14:50:38.710483 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:38 crc kubenswrapper[5200]: I1126 14:50:38.734620 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.734593102 podStartE2EDuration="37.734593102s" podCreationTimestamp="2025-11-26 14:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:38.733811472 +0000 UTC m=+4807.413013330" watchObservedRunningTime="2025-11-26 14:50:38.734593102 +0000 UTC m=+4807.413794920" Nov 26 14:50:38 crc kubenswrapper[5200]: I1126 14:50:38.759832 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.759786817 podStartE2EDuration="36.759786817s" podCreationTimestamp="2025-11-26 14:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:38.753881451 +0000 UTC m=+4807.433083299" watchObservedRunningTime="2025-11-26 14:50:38.759786817 +0000 UTC m=+4807.438988635" Nov 26 14:50:41 crc kubenswrapper[5200]: E1126 14:50:41.759265 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1304225 actualBytes=10240 
Nov 26 14:50:49 crc kubenswrapper[5200]: I1126 14:50:49.720411 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 26 14:50:49 crc kubenswrapper[5200]: I1126 14:50:49.721746 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.288546 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.300892 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.302481 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.304820 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"default-dockercfg-8vwh2\"" Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.319600 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kgfh\" (UniqueName: \"kubernetes.io/projected/1b4bcc26-cd8d-4075-a007-07c49b0b28ab-kube-api-access-7kgfh\") pod \"mariadb-client-1-default\" (UID: \"1b4bcc26-cd8d-4075-a007-07c49b0b28ab\") " pod="openstack/mariadb-client-1-default" Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.422895 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kgfh\" (UniqueName: \"kubernetes.io/projected/1b4bcc26-cd8d-4075-a007-07c49b0b28ab-kube-api-access-7kgfh\") pod \"mariadb-client-1-default\" (UID: \"1b4bcc26-cd8d-4075-a007-07c49b0b28ab\") " pod="openstack/mariadb-client-1-default" Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.470236 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kgfh\" (UniqueName: \"kubernetes.io/projected/1b4bcc26-cd8d-4075-a007-07c49b0b28ab-kube-api-access-7kgfh\") pod \"mariadb-client-1-default\" (UID: \"1b4bcc26-cd8d-4075-a007-07c49b0b28ab\") " pod="openstack/mariadb-client-1-default" Nov 26 14:50:59 crc kubenswrapper[5200]: I1126 14:50:59.634211 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 26 14:51:00 crc kubenswrapper[5200]: I1126 14:51:00.215072 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 26 14:51:00 crc kubenswrapper[5200]: W1126 14:51:00.223405 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b4bcc26_cd8d_4075_a007_07c49b0b28ab.slice/crio-ac232d2baa34f7f6dcfda7484df5de1add12d985df6c3104de81811082c39412 WatchSource:0}: Error finding container ac232d2baa34f7f6dcfda7484df5de1add12d985df6c3104de81811082c39412: Status 404 returned error can't find the container with id ac232d2baa34f7f6dcfda7484df5de1add12d985df6c3104de81811082c39412 Nov 26 14:51:00 crc kubenswrapper[5200]: I1126 14:51:00.927832 5200 generic.go:358] "Generic (PLEG): container finished" podID="1b4bcc26-cd8d-4075-a007-07c49b0b28ab" containerID="24d158d9d5b459b24e547ae107f017e6a91f995ad88046d2533da12919bb0fce" exitCode=0 Nov 26 14:51:00 crc kubenswrapper[5200]: I1126 14:51:00.928445 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"1b4bcc26-cd8d-4075-a007-07c49b0b28ab","Type":"ContainerDied","Data":"24d158d9d5b459b24e547ae107f017e6a91f995ad88046d2533da12919bb0fce"} Nov 26 14:51:00 crc kubenswrapper[5200]: I1126 14:51:00.928504 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"1b4bcc26-cd8d-4075-a007-07c49b0b28ab","Type":"ContainerStarted","Data":"ac232d2baa34f7f6dcfda7484df5de1add12d985df6c3104de81811082c39412"} Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.406303 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.435640 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_1b4bcc26-cd8d-4075-a007-07c49b0b28ab/mariadb-client-1-default/0.log" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.467549 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.478555 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.478900 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kgfh\" (UniqueName: \"kubernetes.io/projected/1b4bcc26-cd8d-4075-a007-07c49b0b28ab-kube-api-access-7kgfh\") pod \"1b4bcc26-cd8d-4075-a007-07c49b0b28ab\" (UID: \"1b4bcc26-cd8d-4075-a007-07c49b0b28ab\") " Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.490556 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4bcc26-cd8d-4075-a007-07c49b0b28ab-kube-api-access-7kgfh" (OuterVolumeSpecName: "kube-api-access-7kgfh") pod "1b4bcc26-cd8d-4075-a007-07c49b0b28ab" (UID: "1b4bcc26-cd8d-4075-a007-07c49b0b28ab"). InnerVolumeSpecName "kube-api-access-7kgfh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.581639 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7kgfh\" (UniqueName: \"kubernetes.io/projected/1b4bcc26-cd8d-4075-a007-07c49b0b28ab-kube-api-access-7kgfh\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.914897 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.917628 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b4bcc26-cd8d-4075-a007-07c49b0b28ab" containerName="mariadb-client-1-default" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.917749 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4bcc26-cd8d-4075-a007-07c49b0b28ab" containerName="mariadb-client-1-default" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.918205 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b4bcc26-cd8d-4075-a007-07c49b0b28ab" containerName="mariadb-client-1-default" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.927296 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.927560 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 26 14:51:02 crc kubenswrapper[5200]: I1126 14:51:02.993676 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ldrl\" (UniqueName: \"kubernetes.io/projected/fbe49154-9e97-4fb7-9610-940082b28e77-kube-api-access-4ldrl\") pod \"mariadb-client-2-default\" (UID: \"fbe49154-9e97-4fb7-9610-940082b28e77\") " pod="openstack/mariadb-client-2-default" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.013104 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.014033 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4bcc26-cd8d-4075-a007-07c49b0b28ab" path="/var/lib/kubelet/pods/1b4bcc26-cd8d-4075-a007-07c49b0b28ab/volumes" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.015217 5200 scope.go:117] "RemoveContainer" containerID="24d158d9d5b459b24e547ae107f017e6a91f995ad88046d2533da12919bb0fce" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.095953 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ldrl\" (UniqueName: \"kubernetes.io/projected/fbe49154-9e97-4fb7-9610-940082b28e77-kube-api-access-4ldrl\") pod \"mariadb-client-2-default\" (UID: \"fbe49154-9e97-4fb7-9610-940082b28e77\") " pod="openstack/mariadb-client-2-default" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.113902 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ldrl\" (UniqueName: \"kubernetes.io/projected/fbe49154-9e97-4fb7-9610-940082b28e77-kube-api-access-4ldrl\") pod \"mariadb-client-2-default\" (UID: \"fbe49154-9e97-4fb7-9610-940082b28e77\") " pod="openstack/mariadb-client-2-default" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.155297 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.155424 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.295517 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.874142 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 26 14:51:03 crc kubenswrapper[5200]: W1126 14:51:03.877255 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbe49154_9e97_4fb7_9610_940082b28e77.slice/crio-28dac855e1564c8b38d397c968f0588cbfc7f7d11d23078c4b3c48e4477159fe WatchSource:0}: Error finding container 28dac855e1564c8b38d397c968f0588cbfc7f7d11d23078c4b3c48e4477159fe: Status 404 returned error can't find the container with id 28dac855e1564c8b38d397c968f0588cbfc7f7d11d23078c4b3c48e4477159fe Nov 26 14:51:03 crc kubenswrapper[5200]: I1126 14:51:03.986927 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"fbe49154-9e97-4fb7-9610-940082b28e77","Type":"ContainerStarted","Data":"28dac855e1564c8b38d397c968f0588cbfc7f7d11d23078c4b3c48e4477159fe"} Nov 26 14:51:05 crc kubenswrapper[5200]: I1126 14:51:05.001420 5200 generic.go:358] "Generic (PLEG): container finished" podID="fbe49154-9e97-4fb7-9610-940082b28e77" containerID="5a648c7b7c9d79bd41da9f666e73281f018d7868772076f12ba79624f2640966" exitCode=1 Nov 26 14:51:05 crc kubenswrapper[5200]: I1126 14:51:05.001563 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"fbe49154-9e97-4fb7-9610-940082b28e77","Type":"ContainerDied","Data":"5a648c7b7c9d79bd41da9f666e73281f018d7868772076f12ba79624f2640966"} Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.474021 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.494209 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_fbe49154-9e97-4fb7-9610-940082b28e77/mariadb-client-2-default/0.log" Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.525564 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.532493 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.566499 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ldrl\" (UniqueName: \"kubernetes.io/projected/fbe49154-9e97-4fb7-9610-940082b28e77-kube-api-access-4ldrl\") pod \"fbe49154-9e97-4fb7-9610-940082b28e77\" (UID: \"fbe49154-9e97-4fb7-9610-940082b28e77\") " Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.572393 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe49154-9e97-4fb7-9610-940082b28e77-kube-api-access-4ldrl" (OuterVolumeSpecName: "kube-api-access-4ldrl") pod "fbe49154-9e97-4fb7-9610-940082b28e77" (UID: "fbe49154-9e97-4fb7-9610-940082b28e77"). InnerVolumeSpecName "kube-api-access-4ldrl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.668941 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ldrl\" (UniqueName: \"kubernetes.io/projected/fbe49154-9e97-4fb7-9610-940082b28e77-kube-api-access-4ldrl\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:06 crc kubenswrapper[5200]: I1126 14:51:06.988000 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe49154-9e97-4fb7-9610-940082b28e77" path="/var/lib/kubelet/pods/fbe49154-9e97-4fb7-9610-940082b28e77/volumes" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.039018 5200 scope.go:117] "RemoveContainer" containerID="5a648c7b7c9d79bd41da9f666e73281f018d7868772076f12ba79624f2640966" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.039293 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.044273 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.045890 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbe49154-9e97-4fb7-9610-940082b28e77" containerName="mariadb-client-2-default" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.045918 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe49154-9e97-4fb7-9610-940082b28e77" containerName="mariadb-client-2-default" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.046073 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbe49154-9e97-4fb7-9610-940082b28e77" containerName="mariadb-client-2-default" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.100316 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.100508 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.104116 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"default-dockercfg-8vwh2\"" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.182046 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvh7\" (UniqueName: \"kubernetes.io/projected/0e2f5690-9ebb-449a-9307-9206c55ff191-kube-api-access-4bvh7\") pod \"mariadb-client-1\" (UID: \"0e2f5690-9ebb-449a-9307-9206c55ff191\") " pod="openstack/mariadb-client-1" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.283801 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bvh7\" (UniqueName: \"kubernetes.io/projected/0e2f5690-9ebb-449a-9307-9206c55ff191-kube-api-access-4bvh7\") pod \"mariadb-client-1\" (UID: \"0e2f5690-9ebb-449a-9307-9206c55ff191\") " pod="openstack/mariadb-client-1" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.320155 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bvh7\" (UniqueName: \"kubernetes.io/projected/0e2f5690-9ebb-449a-9307-9206c55ff191-kube-api-access-4bvh7\") pod \"mariadb-client-1\" (UID: \"0e2f5690-9ebb-449a-9307-9206c55ff191\") " pod="openstack/mariadb-client-1" Nov 26 14:51:07 crc kubenswrapper[5200]: I1126 14:51:07.430320 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Nov 26 14:51:08 crc kubenswrapper[5200]: I1126 14:51:08.092014 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Nov 26 14:51:09 crc kubenswrapper[5200]: I1126 14:51:09.060596 5200 generic.go:358] "Generic (PLEG): container finished" podID="0e2f5690-9ebb-449a-9307-9206c55ff191" containerID="bda545e3288bdc23229c491d5213c1316c486f02a293d9e55b4fbbe040b144cc" exitCode=0 Nov 26 14:51:09 crc kubenswrapper[5200]: I1126 14:51:09.060754 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"0e2f5690-9ebb-449a-9307-9206c55ff191","Type":"ContainerDied","Data":"bda545e3288bdc23229c491d5213c1316c486f02a293d9e55b4fbbe040b144cc"} Nov 26 14:51:09 crc kubenswrapper[5200]: I1126 14:51:09.060840 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"0e2f5690-9ebb-449a-9307-9206c55ff191","Type":"ContainerStarted","Data":"c9b60f3a3409b2c65b7ffdc4764fe54747eedf5cd86bcdac3bd81514c9efea2c"} Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.513492 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.532822 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_0e2f5690-9ebb-449a-9307-9206c55ff191/mariadb-client-1/0.log" Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.562163 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.568605 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bvh7\" (UniqueName: \"kubernetes.io/projected/0e2f5690-9ebb-449a-9307-9206c55ff191-kube-api-access-4bvh7\") pod \"0e2f5690-9ebb-449a-9307-9206c55ff191\" (UID: \"0e2f5690-9ebb-449a-9307-9206c55ff191\") " Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.569366 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.579045 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2f5690-9ebb-449a-9307-9206c55ff191-kube-api-access-4bvh7" (OuterVolumeSpecName: "kube-api-access-4bvh7") pod "0e2f5690-9ebb-449a-9307-9206c55ff191" (UID: "0e2f5690-9ebb-449a-9307-9206c55ff191"). InnerVolumeSpecName "kube-api-access-4bvh7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.671156 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bvh7\" (UniqueName: \"kubernetes.io/projected/0e2f5690-9ebb-449a-9307-9206c55ff191-kube-api-access-4bvh7\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:10 crc kubenswrapper[5200]: I1126 14:51:10.987520 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2f5690-9ebb-449a-9307-9206c55ff191" path="/var/lib/kubelet/pods/0e2f5690-9ebb-449a-9307-9206c55ff191/volumes" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.013804 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.017004 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e2f5690-9ebb-449a-9307-9206c55ff191" containerName="mariadb-client-1" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.017059 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2f5690-9ebb-449a-9307-9206c55ff191" containerName="mariadb-client-1" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.017522 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e2f5690-9ebb-449a-9307-9206c55ff191" containerName="mariadb-client-1" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.027636 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.027874 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.080495 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtkpr\" (UniqueName: \"kubernetes.io/projected/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b-kube-api-access-qtkpr\") pod \"mariadb-client-4-default\" (UID: \"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b\") " pod="openstack/mariadb-client-4-default" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.081804 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.081873 5200 scope.go:117] "RemoveContainer" containerID="bda545e3288bdc23229c491d5213c1316c486f02a293d9e55b4fbbe040b144cc" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.183230 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtkpr\" (UniqueName: \"kubernetes.io/projected/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b-kube-api-access-qtkpr\") pod \"mariadb-client-4-default\" (UID: \"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b\") " pod="openstack/mariadb-client-4-default" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.205100 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtkpr\" (UniqueName: \"kubernetes.io/projected/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b-kube-api-access-qtkpr\") pod \"mariadb-client-4-default\" (UID: \"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b\") " pod="openstack/mariadb-client-4-default" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.351522 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 26 14:51:11 crc kubenswrapper[5200]: I1126 14:51:11.903791 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 26 14:51:12 crc kubenswrapper[5200]: I1126 14:51:12.090067 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b","Type":"ContainerStarted","Data":"b2ff8e44991707b3f65ae2c7da1faf0c3a663588a6a8f40e6f0b16a9881ecb61"} Nov 26 14:51:13 crc kubenswrapper[5200]: I1126 14:51:13.107275 5200 generic.go:358] "Generic (PLEG): container finished" podID="5fc163e7-8821-4b4f-8d2d-9d3ce797c59b" containerID="cd1df5e5afe807b77bbfc841d3f7fa5a2ebb64d133bc99b327cf652e9ca33b5b" exitCode=0 Nov 26 14:51:13 crc kubenswrapper[5200]: I1126 14:51:13.107354 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b","Type":"ContainerDied","Data":"cd1df5e5afe807b77bbfc841d3f7fa5a2ebb64d133bc99b327cf652e9ca33b5b"} Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.627151 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.657126 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_5fc163e7-8821-4b4f-8d2d-9d3ce797c59b/mariadb-client-4-default/0.log" Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.687858 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.690439 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.754833 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtkpr\" (UniqueName: \"kubernetes.io/projected/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b-kube-api-access-qtkpr\") pod \"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b\" (UID: \"5fc163e7-8821-4b4f-8d2d-9d3ce797c59b\") " Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.772566 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b-kube-api-access-qtkpr" (OuterVolumeSpecName: "kube-api-access-qtkpr") pod "5fc163e7-8821-4b4f-8d2d-9d3ce797c59b" (UID: "5fc163e7-8821-4b4f-8d2d-9d3ce797c59b"). InnerVolumeSpecName "kube-api-access-qtkpr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.857179 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtkpr\" (UniqueName: \"kubernetes.io/projected/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b-kube-api-access-qtkpr\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:14 crc kubenswrapper[5200]: I1126 14:51:14.988121 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc163e7-8821-4b4f-8d2d-9d3ce797c59b" path="/var/lib/kubelet/pods/5fc163e7-8821-4b4f-8d2d-9d3ce797c59b/volumes" Nov 26 14:51:15 crc kubenswrapper[5200]: I1126 14:51:15.129670 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 26 14:51:15 crc kubenswrapper[5200]: I1126 14:51:15.129859 5200 scope.go:117] "RemoveContainer" containerID="cd1df5e5afe807b77bbfc841d3f7fa5a2ebb64d133bc99b327cf652e9ca33b5b" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.077993 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.080988 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fc163e7-8821-4b4f-8d2d-9d3ce797c59b" containerName="mariadb-client-4-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.081014 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc163e7-8821-4b4f-8d2d-9d3ce797c59b" containerName="mariadb-client-4-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.081218 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fc163e7-8821-4b4f-8d2d-9d3ce797c59b" containerName="mariadb-client-4-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.088464 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.091599 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"default-dockercfg-8vwh2\"" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.092286 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.249911 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptllj\" (UniqueName: \"kubernetes.io/projected/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72-kube-api-access-ptllj\") pod \"mariadb-client-5-default\" (UID: \"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72\") " pod="openstack/mariadb-client-5-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.352769 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptllj\" (UniqueName: \"kubernetes.io/projected/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72-kube-api-access-ptllj\") pod \"mariadb-client-5-default\" (UID: \"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72\") " pod="openstack/mariadb-client-5-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.380347 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptllj\" (UniqueName: \"kubernetes.io/projected/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72-kube-api-access-ptllj\") pod \"mariadb-client-5-default\" (UID: \"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72\") " pod="openstack/mariadb-client-5-default" Nov 26 14:51:19 crc kubenswrapper[5200]: I1126 14:51:19.422236 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 26 14:51:20 crc kubenswrapper[5200]: I1126 14:51:20.022274 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 26 14:51:20 crc kubenswrapper[5200]: W1126 14:51:20.030960 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b9ed389_bbc9_4d60_a52f_f31f3ce55a72.slice/crio-e90768372c0e960e63f5e4e11c9405ca8e4857a5188438529803fb95e721ae0c WatchSource:0}: Error finding container e90768372c0e960e63f5e4e11c9405ca8e4857a5188438529803fb95e721ae0c: Status 404 returned error can't find the container with id e90768372c0e960e63f5e4e11c9405ca8e4857a5188438529803fb95e721ae0c Nov 26 14:51:20 crc kubenswrapper[5200]: I1126 14:51:20.191191 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72","Type":"ContainerStarted","Data":"e90768372c0e960e63f5e4e11c9405ca8e4857a5188438529803fb95e721ae0c"} Nov 26 14:51:21 crc kubenswrapper[5200]: I1126 14:51:21.203982 5200 generic.go:358] "Generic (PLEG): container finished" podID="1b9ed389-bbc9-4d60-a52f-f31f3ce55a72" containerID="9c94a2aa2ba0fbc08ba5c711b05f65760b05d560597a29f3bc836cdfddbafc7e" exitCode=0 Nov 26 14:51:21 crc kubenswrapper[5200]: I1126 14:51:21.204057 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72","Type":"ContainerDied","Data":"9c94a2aa2ba0fbc08ba5c711b05f65760b05d560597a29f3bc836cdfddbafc7e"} Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.562236 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.588530 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_1b9ed389-bbc9-4d60-a52f-f31f3ce55a72/mariadb-client-5-default/0.log" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.616149 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptllj\" (UniqueName: \"kubernetes.io/projected/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72-kube-api-access-ptllj\") pod \"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72\" (UID: \"1b9ed389-bbc9-4d60-a52f-f31f3ce55a72\") " Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.618904 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.622351 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72-kube-api-access-ptllj" (OuterVolumeSpecName: "kube-api-access-ptllj") pod "1b9ed389-bbc9-4d60-a52f-f31f3ce55a72" (UID: "1b9ed389-bbc9-4d60-a52f-f31f3ce55a72"). InnerVolumeSpecName "kube-api-access-ptllj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.626709 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.718325 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptllj\" (UniqueName: \"kubernetes.io/projected/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72-kube-api-access-ptllj\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.773727 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.774872 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b9ed389-bbc9-4d60-a52f-f31f3ce55a72" containerName="mariadb-client-5-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.774896 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9ed389-bbc9-4d60-a52f-f31f3ce55a72" containerName="mariadb-client-5-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.775031 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b9ed389-bbc9-4d60-a52f-f31f3ce55a72" containerName="mariadb-client-5-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.780056 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.781182 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.819568 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2pkd\" (UniqueName: \"kubernetes.io/projected/8ea22a31-7bce-4b17-a7f4-b9cf570215b4-kube-api-access-x2pkd\") pod \"mariadb-client-6-default\" (UID: \"8ea22a31-7bce-4b17-a7f4-b9cf570215b4\") " pod="openstack/mariadb-client-6-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.921466 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2pkd\" (UniqueName: \"kubernetes.io/projected/8ea22a31-7bce-4b17-a7f4-b9cf570215b4-kube-api-access-x2pkd\") pod \"mariadb-client-6-default\" (UID: \"8ea22a31-7bce-4b17-a7f4-b9cf570215b4\") " pod="openstack/mariadb-client-6-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.946274 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2pkd\" (UniqueName: \"kubernetes.io/projected/8ea22a31-7bce-4b17-a7f4-b9cf570215b4-kube-api-access-x2pkd\") pod \"mariadb-client-6-default\" (UID: \"8ea22a31-7bce-4b17-a7f4-b9cf570215b4\") " pod="openstack/mariadb-client-6-default" Nov 26 14:51:22 crc kubenswrapper[5200]: I1126 14:51:22.981142 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9ed389-bbc9-4d60-a52f-f31f3ce55a72" path="/var/lib/kubelet/pods/1b9ed389-bbc9-4d60-a52f-f31f3ce55a72/volumes" Nov 26 14:51:23 crc kubenswrapper[5200]: I1126 14:51:23.100902 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 26 14:51:23 crc kubenswrapper[5200]: I1126 14:51:23.223215 5200 scope.go:117] "RemoveContainer" containerID="9c94a2aa2ba0fbc08ba5c711b05f65760b05d560597a29f3bc836cdfddbafc7e" Nov 26 14:51:23 crc kubenswrapper[5200]: I1126 14:51:23.223216 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 26 14:51:23 crc kubenswrapper[5200]: I1126 14:51:23.646275 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 26 14:51:23 crc kubenswrapper[5200]: W1126 14:51:23.648518 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ea22a31_7bce_4b17_a7f4_b9cf570215b4.slice/crio-d1f43a01a55b697f188155ee8c2042890045148c2e939f124bb63d0518099fca WatchSource:0}: Error finding container d1f43a01a55b697f188155ee8c2042890045148c2e939f124bb63d0518099fca: Status 404 returned error can't find the container with id d1f43a01a55b697f188155ee8c2042890045148c2e939f124bb63d0518099fca Nov 26 14:51:24 crc kubenswrapper[5200]: I1126 14:51:24.241965 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8ea22a31-7bce-4b17-a7f4-b9cf570215b4","Type":"ContainerStarted","Data":"b8439903abf9004e97fe5b75234e1783e06687b80f8e5fab9e0292bd1e9c8bc0"} Nov 26 14:51:24 crc kubenswrapper[5200]: I1126 14:51:24.242016 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8ea22a31-7bce-4b17-a7f4-b9cf570215b4","Type":"ContainerStarted","Data":"d1f43a01a55b697f188155ee8c2042890045148c2e939f124bb63d0518099fca"} Nov 26 14:51:24 crc kubenswrapper[5200]: I1126 14:51:24.274631 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.274604744 podStartE2EDuration="2.274604744s" podCreationTimestamp="2025-11-26 14:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:24.263662205 +0000 UTC m=+4852.942864063" watchObservedRunningTime="2025-11-26 14:51:24.274604744 +0000 UTC m=+4852.953806562" Nov 26 14:51:24 crc kubenswrapper[5200]: I1126 14:51:24.361640 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_8ea22a31-7bce-4b17-a7f4-b9cf570215b4/mariadb-client-6-default/0.log" Nov 26 14:51:25 crc kubenswrapper[5200]: I1126 14:51:25.259774 5200 generic.go:358] "Generic (PLEG): container finished" podID="8ea22a31-7bce-4b17-a7f4-b9cf570215b4" containerID="b8439903abf9004e97fe5b75234e1783e06687b80f8e5fab9e0292bd1e9c8bc0" exitCode=1 Nov 26 14:51:25 crc kubenswrapper[5200]: I1126 14:51:25.259851 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8ea22a31-7bce-4b17-a7f4-b9cf570215b4","Type":"ContainerDied","Data":"b8439903abf9004e97fe5b75234e1783e06687b80f8e5fab9e0292bd1e9c8bc0"} Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.708488 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.751284 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.756788 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.789176 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2pkd\" (UniqueName: \"kubernetes.io/projected/8ea22a31-7bce-4b17-a7f4-b9cf570215b4-kube-api-access-x2pkd\") pod \"8ea22a31-7bce-4b17-a7f4-b9cf570215b4\" (UID: \"8ea22a31-7bce-4b17-a7f4-b9cf570215b4\") " Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.802534 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea22a31-7bce-4b17-a7f4-b9cf570215b4-kube-api-access-x2pkd" (OuterVolumeSpecName: "kube-api-access-x2pkd") pod "8ea22a31-7bce-4b17-a7f4-b9cf570215b4" (UID: "8ea22a31-7bce-4b17-a7f4-b9cf570215b4"). InnerVolumeSpecName "kube-api-access-x2pkd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.893149 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x2pkd\" (UniqueName: \"kubernetes.io/projected/8ea22a31-7bce-4b17-a7f4-b9cf570215b4-kube-api-access-x2pkd\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.906486 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.907610 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ea22a31-7bce-4b17-a7f4-b9cf570215b4" containerName="mariadb-client-6-default" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.907629 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea22a31-7bce-4b17-a7f4-b9cf570215b4" containerName="mariadb-client-6-default" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.907801 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ea22a31-7bce-4b17-a7f4-b9cf570215b4" containerName="mariadb-client-6-default" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.914670 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.922782 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.982563 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea22a31-7bce-4b17-a7f4-b9cf570215b4" path="/var/lib/kubelet/pods/8ea22a31-7bce-4b17-a7f4-b9cf570215b4/volumes" Nov 26 14:51:26 crc kubenswrapper[5200]: I1126 14:51:26.995035 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qpv\" (UniqueName: \"kubernetes.io/projected/5ef8c7ae-844b-4aae-86d8-33ec648c2824-kube-api-access-x7qpv\") pod \"mariadb-client-7-default\" (UID: \"5ef8c7ae-844b-4aae-86d8-33ec648c2824\") " pod="openstack/mariadb-client-7-default" Nov 26 14:51:27 crc kubenswrapper[5200]: I1126 14:51:27.097059 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qpv\" (UniqueName: \"kubernetes.io/projected/5ef8c7ae-844b-4aae-86d8-33ec648c2824-kube-api-access-x7qpv\") pod \"mariadb-client-7-default\" (UID: \"5ef8c7ae-844b-4aae-86d8-33ec648c2824\") " pod="openstack/mariadb-client-7-default" Nov 26 14:51:27 crc kubenswrapper[5200]: I1126 14:51:27.119070 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qpv\" (UniqueName: \"kubernetes.io/projected/5ef8c7ae-844b-4aae-86d8-33ec648c2824-kube-api-access-x7qpv\") pod \"mariadb-client-7-default\" (UID: \"5ef8c7ae-844b-4aae-86d8-33ec648c2824\") " pod="openstack/mariadb-client-7-default" Nov 26 14:51:27 crc kubenswrapper[5200]: I1126 14:51:27.243196 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 26 14:51:27 crc kubenswrapper[5200]: I1126 14:51:27.286317 5200 scope.go:117] "RemoveContainer" containerID="b8439903abf9004e97fe5b75234e1783e06687b80f8e5fab9e0292bd1e9c8bc0" Nov 26 14:51:27 crc kubenswrapper[5200]: I1126 14:51:27.286549 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 26 14:51:27 crc kubenswrapper[5200]: I1126 14:51:27.842616 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 26 14:51:27 crc kubenswrapper[5200]: W1126 14:51:27.848923 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ef8c7ae_844b_4aae_86d8_33ec648c2824.slice/crio-abe2fac1fbd1c852f8d6f9344c6d97ab10840d6de84d7e214e98b91796bc7d7b WatchSource:0}: Error finding container abe2fac1fbd1c852f8d6f9344c6d97ab10840d6de84d7e214e98b91796bc7d7b: Status 404 returned error can't find the container with id abe2fac1fbd1c852f8d6f9344c6d97ab10840d6de84d7e214e98b91796bc7d7b Nov 26 14:51:28 crc kubenswrapper[5200]: I1126 14:51:28.296532 5200 generic.go:358] "Generic (PLEG): container finished" podID="5ef8c7ae-844b-4aae-86d8-33ec648c2824" containerID="202e7d0816f03fc4fa5bec3180c5e92399ed830edb9dd083883eaa360bd0fd01" exitCode=0 Nov 26 14:51:28 crc kubenswrapper[5200]: I1126 14:51:28.296955 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5ef8c7ae-844b-4aae-86d8-33ec648c2824","Type":"ContainerDied","Data":"202e7d0816f03fc4fa5bec3180c5e92399ed830edb9dd083883eaa360bd0fd01"} Nov 26 14:51:28 crc kubenswrapper[5200]: I1126 14:51:28.297367 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5ef8c7ae-844b-4aae-86d8-33ec648c2824","Type":"ContainerStarted","Data":"abe2fac1fbd1c852f8d6f9344c6d97ab10840d6de84d7e214e98b91796bc7d7b"} Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.755641 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.773332 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_5ef8c7ae-844b-4aae-86d8-33ec648c2824/mariadb-client-7-default/0.log" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.801141 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.819780 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.857912 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7qpv\" (UniqueName: \"kubernetes.io/projected/5ef8c7ae-844b-4aae-86d8-33ec648c2824-kube-api-access-x7qpv\") pod \"5ef8c7ae-844b-4aae-86d8-33ec648c2824\" (UID: \"5ef8c7ae-844b-4aae-86d8-33ec648c2824\") " Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.865404 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef8c7ae-844b-4aae-86d8-33ec648c2824-kube-api-access-x7qpv" (OuterVolumeSpecName: "kube-api-access-x7qpv") pod "5ef8c7ae-844b-4aae-86d8-33ec648c2824" (UID: "5ef8c7ae-844b-4aae-86d8-33ec648c2824"). InnerVolumeSpecName "kube-api-access-x7qpv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.946709 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.947989 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ef8c7ae-844b-4aae-86d8-33ec648c2824" containerName="mariadb-client-7-default" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.948009 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef8c7ae-844b-4aae-86d8-33ec648c2824" containerName="mariadb-client-7-default" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.948199 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ef8c7ae-844b-4aae-86d8-33ec648c2824" containerName="mariadb-client-7-default" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.954403 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.954524 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 26 14:51:29 crc kubenswrapper[5200]: I1126 14:51:29.960011 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7qpv\" (UniqueName: \"kubernetes.io/projected/5ef8c7ae-844b-4aae-86d8-33ec648c2824-kube-api-access-x7qpv\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.061406 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bclfs\" (UniqueName: \"kubernetes.io/projected/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9-kube-api-access-bclfs\") pod \"mariadb-client-2\" (UID: \"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9\") " pod="openstack/mariadb-client-2" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.164677 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bclfs\" (UniqueName: \"kubernetes.io/projected/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9-kube-api-access-bclfs\") pod \"mariadb-client-2\" (UID: \"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9\") " pod="openstack/mariadb-client-2" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.195510 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bclfs\" (UniqueName: \"kubernetes.io/projected/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9-kube-api-access-bclfs\") pod \"mariadb-client-2\" (UID: \"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9\") " pod="openstack/mariadb-client-2" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.277875 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.324061 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abe2fac1fbd1c852f8d6f9344c6d97ab10840d6de84d7e214e98b91796bc7d7b" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.324662 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.363047 5200 status_manager.go:895] "Failed to get status for pod" podUID="5ef8c7ae-844b-4aae-86d8-33ec648c2824" pod="openstack/mariadb-client-7-default" err="pods \"mariadb-client-7-default\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.702993 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Nov 26 14:51:30 crc kubenswrapper[5200]: W1126 14:51:30.711354 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfb9c0b_1f99_45c9_8f2b_f3ea86b201d9.slice/crio-5e6d350761d614b1c12e09e17cb41f0a5e7e3e2ed2c5da65f30a5e9b66fb9b10 WatchSource:0}: Error finding container 5e6d350761d614b1c12e09e17cb41f0a5e7e3e2ed2c5da65f30a5e9b66fb9b10: Status 404 returned error can't find the container with id 5e6d350761d614b1c12e09e17cb41f0a5e7e3e2ed2c5da65f30a5e9b66fb9b10 Nov 26 14:51:30 crc kubenswrapper[5200]: I1126 14:51:30.990954 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef8c7ae-844b-4aae-86d8-33ec648c2824" path="/var/lib/kubelet/pods/5ef8c7ae-844b-4aae-86d8-33ec648c2824/volumes" Nov 26 14:51:31 crc kubenswrapper[5200]: I1126 14:51:31.340333 5200 generic.go:358] "Generic (PLEG): container finished" podID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" containerID="a372b5a0dfcc36b71e3d7f4ccd0156e40d2ee131a8243ae15bf7ec47c103dd48" exitCode=0 Nov 26 14:51:31 crc kubenswrapper[5200]: I1126 14:51:31.340479 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9","Type":"ContainerDied","Data":"a372b5a0dfcc36b71e3d7f4ccd0156e40d2ee131a8243ae15bf7ec47c103dd48"} Nov 26 14:51:31 crc kubenswrapper[5200]: I1126 14:51:31.340564 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9","Type":"ContainerStarted","Data":"5e6d350761d614b1c12e09e17cb41f0a5e7e3e2ed2c5da65f30a5e9b66fb9b10"} Nov 26 14:51:32 crc kubenswrapper[5200]: I1126 14:51:32.936347 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Nov 26 14:51:32 crc kubenswrapper[5200]: I1126 14:51:32.961381 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9/mariadb-client-2/0.log" Nov 26 14:51:32 crc kubenswrapper[5200]: I1126 14:51:32.987989 5200 status_manager.go:895] "Failed to get status for pod" podUID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" pod="openstack/mariadb-client-2" err="pods \"mariadb-client-2\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.002070 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.002112 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.127936 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bclfs\" (UniqueName: \"kubernetes.io/projected/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9-kube-api-access-bclfs\") pod \"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9\" (UID: \"9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9\") " Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.136501 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9-kube-api-access-bclfs" (OuterVolumeSpecName: "kube-api-access-bclfs") pod "9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" (UID: "9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9"). InnerVolumeSpecName "kube-api-access-bclfs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.154643 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.154864 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.229371 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bclfs\" (UniqueName: \"kubernetes.io/projected/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9-kube-api-access-bclfs\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.373592 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e6d350761d614b1c12e09e17cb41f0a5e7e3e2ed2c5da65f30a5e9b66fb9b10" Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.373647 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Nov 26 14:51:33 crc kubenswrapper[5200]: I1126 14:51:33.401128 5200 status_manager.go:895] "Failed to get status for pod" podUID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" pod="openstack/mariadb-client-2" err="pods \"mariadb-client-2\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 14:51:34 crc kubenswrapper[5200]: I1126 14:51:34.985056 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" path="/var/lib/kubelet/pods/9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9/volumes" Nov 26 14:51:42 crc kubenswrapper[5200]: E1126 14:51:42.100819 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1304226 actualBytes=10240 Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.154238 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.154961 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.155020 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.155768 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea3a2115a282f6ba849c903139ac342ea4d280f4ca9dbebec0216cb6655bfa03"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.155827 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://ea3a2115a282f6ba849c903139ac342ea4d280f4ca9dbebec0216cb6655bfa03" gracePeriod=600 Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.679409 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="ea3a2115a282f6ba849c903139ac342ea4d280f4ca9dbebec0216cb6655bfa03" exitCode=0 Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.679556 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"ea3a2115a282f6ba849c903139ac342ea4d280f4ca9dbebec0216cb6655bfa03"} Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.679909 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc"} Nov 26 14:52:03 crc kubenswrapper[5200]: I1126 14:52:03.679990 5200 scope.go:117] "RemoveContainer" containerID="7f8697a2cbf8b588db0e8aa078201ed5dffa39471e4d19ff4145167bd73e05aa" Nov 26 14:52:27 crc kubenswrapper[5200]: I1126 14:52:27.534512 5200 scope.go:117] "RemoveContainer" containerID="f650c3431ad11a5b3b370cade64429dabda9a6022b34217a543ae6d6f2bfc026" Nov 26 14:52:42 crc kubenswrapper[5200]: E1126 14:52:42.047214 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1302903 actualBytes=10240 Nov 26 14:53:42 crc kubenswrapper[5200]: E1126 14:53:42.117923 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1302903 actualBytes=10240 Nov 26 14:54:03 crc kubenswrapper[5200]: I1126 14:54:03.155214 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:54:03 crc kubenswrapper[5200]: I1126 14:54:03.156451 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:54:33 crc kubenswrapper[5200]: I1126 14:54:33.154297 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:54:33 crc kubenswrapper[5200]: I1126 14:54:33.155909 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:54:42 crc kubenswrapper[5200]: E1126 14:54:42.063914 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1302902 actualBytes=10240 Nov 26 14:54:49 crc kubenswrapper[5200]: I1126 14:54:49.836823 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d2795"] Nov 26 14:54:49 crc kubenswrapper[5200]: I1126 14:54:49.839167 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" containerName="mariadb-client-2" Nov 26 14:54:49 crc kubenswrapper[5200]: I1126 14:54:49.839193 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" containerName="mariadb-client-2" Nov 26 14:54:49 crc kubenswrapper[5200]: I1126 14:54:49.839487 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dfb9c0b-1f99-45c9-8f2b-f3ea86b201d9" containerName="mariadb-client-2" Nov 26 14:54:49 crc kubenswrapper[5200]: I1126 14:54:49.945427 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2795"] Nov 26 14:54:49 crc 
kubenswrapper[5200]: I1126 14:54:49.945573 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.029850 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-catalog-content\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.030135 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mlj\" (UniqueName: \"kubernetes.io/projected/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-kube-api-access-c2mlj\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.030240 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-utilities\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.132005 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-catalog-content\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.132136 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mlj\" (UniqueName: \"kubernetes.io/projected/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-kube-api-access-c2mlj\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.132170 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-utilities\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.132584 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-catalog-content\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.132762 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-utilities\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.162664 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mlj\" (UniqueName: 
\"kubernetes.io/projected/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-kube-api-access-c2mlj\") pod \"redhat-operators-d2795\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.278759 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.791368 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2795"] Nov 26 14:54:50 crc kubenswrapper[5200]: I1126 14:54:50.796671 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:54:51 crc kubenswrapper[5200]: I1126 14:54:51.311554 5200 generic.go:358] "Generic (PLEG): container finished" podID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerID="c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab" exitCode=0 Nov 26 14:54:51 crc kubenswrapper[5200]: I1126 14:54:51.311619 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerDied","Data":"c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab"} Nov 26 14:54:51 crc kubenswrapper[5200]: I1126 14:54:51.311648 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerStarted","Data":"121ded7668009ab8f26cfa624b36f2f40ab3eb2685b8eabea721731af57d78ed"} Nov 26 14:54:53 crc kubenswrapper[5200]: I1126 14:54:53.328894 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerStarted","Data":"1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93"} Nov 26 14:54:54 crc kubenswrapper[5200]: I1126 14:54:54.342844 5200 generic.go:358] "Generic (PLEG): container finished" podID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerID="1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93" exitCode=0 Nov 26 14:54:54 crc kubenswrapper[5200]: I1126 14:54:54.343019 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerDied","Data":"1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93"} Nov 26 14:54:55 crc kubenswrapper[5200]: I1126 14:54:55.355561 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerStarted","Data":"dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3"} Nov 26 14:54:55 crc kubenswrapper[5200]: I1126 14:54:55.378818 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d2795" podStartSLOduration=4.995066702 podStartE2EDuration="6.378798523s" podCreationTimestamp="2025-11-26 14:54:49 +0000 UTC" firstStartedPulling="2025-11-26 14:54:51.313479764 +0000 UTC m=+5059.992681582" lastFinishedPulling="2025-11-26 14:54:52.697211555 +0000 UTC m=+5061.376413403" observedRunningTime="2025-11-26 14:54:55.377142999 +0000 UTC m=+5064.056344847" watchObservedRunningTime="2025-11-26 14:54:55.378798523 +0000 UTC m=+5064.058000341" Nov 26 14:54:58 crc kubenswrapper[5200]: I1126 14:54:58.896144 5200 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tq8zl"] Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.041473 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.051202 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tq8zl"] Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.098983 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj9h8\" (UniqueName: \"kubernetes.io/projected/86609561-9f57-4755-829a-f4f24aa2ef2c-kube-api-access-wj9h8\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.099034 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-catalog-content\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.099070 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-utilities\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.200400 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj9h8\" (UniqueName: \"kubernetes.io/projected/86609561-9f57-4755-829a-f4f24aa2ef2c-kube-api-access-wj9h8\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.200443 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-catalog-content\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.200474 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-utilities\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.200958 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-utilities\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.201133 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-catalog-content\") pod 
\"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.232025 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj9h8\" (UniqueName: \"kubernetes.io/projected/86609561-9f57-4755-829a-f4f24aa2ef2c-kube-api-access-wj9h8\") pod \"certified-operators-tq8zl\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.358980 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:54:59 crc kubenswrapper[5200]: I1126 14:54:59.794389 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tq8zl"] Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.279495 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.280128 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.335227 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.415607 5200 generic.go:358] "Generic (PLEG): container finished" podID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerID="3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b" exitCode=0 Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.415959 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tq8zl" event={"ID":"86609561-9f57-4755-829a-f4f24aa2ef2c","Type":"ContainerDied","Data":"3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b"} Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.415993 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tq8zl" event={"ID":"86609561-9f57-4755-829a-f4f24aa2ef2c","Type":"ContainerStarted","Data":"eb9d643be006b09a9c12c3874165b93a20642ec202657675511762ff30818877"} Nov 26 14:55:00 crc kubenswrapper[5200]: I1126 14:55:00.478320 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:55:02 crc kubenswrapper[5200]: I1126 14:55:02.436422 5200 generic.go:358] "Generic (PLEG): container finished" podID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerID="e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286" exitCode=0 Nov 26 14:55:02 crc kubenswrapper[5200]: I1126 14:55:02.436518 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tq8zl" event={"ID":"86609561-9f57-4755-829a-f4f24aa2ef2c","Type":"ContainerDied","Data":"e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286"} Nov 26 14:55:02 crc kubenswrapper[5200]: I1126 14:55:02.663640 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2795"] Nov 26 14:55:02 crc kubenswrapper[5200]: I1126 14:55:02.663947 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d2795" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" 
containerName="registry-server" containerID="cri-o://dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3" gracePeriod=2 Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.110277 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.154265 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.154535 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.154640 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.155441 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.155607 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" gracePeriod=600 Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.276322 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-catalog-content\") pod \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.276413 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-utilities\") pod \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.276461 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2mlj\" (UniqueName: \"kubernetes.io/projected/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-kube-api-access-c2mlj\") pod \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\" (UID: \"f15e26ce-61c0-4c5f-a6b2-67b07960aa03\") " Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.278780 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-utilities" (OuterVolumeSpecName: "utilities") pod "f15e26ce-61c0-4c5f-a6b2-67b07960aa03" (UID: "f15e26ce-61c0-4c5f-a6b2-67b07960aa03"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:55:03 crc kubenswrapper[5200]: E1126 14:55:03.279245 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.284849 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-kube-api-access-c2mlj" (OuterVolumeSpecName: "kube-api-access-c2mlj") pod "f15e26ce-61c0-4c5f-a6b2-67b07960aa03" (UID: "f15e26ce-61c0-4c5f-a6b2-67b07960aa03"). InnerVolumeSpecName "kube-api-access-c2mlj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.366816 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f15e26ce-61c0-4c5f-a6b2-67b07960aa03" (UID: "f15e26ce-61c0-4c5f-a6b2-67b07960aa03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.379201 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.379236 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.379247 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c2mlj\" (UniqueName: \"kubernetes.io/projected/f15e26ce-61c0-4c5f-a6b2-67b07960aa03-kube-api-access-c2mlj\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.447166 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" exitCode=0 Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.447212 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc"} Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.447262 5200 scope.go:117] "RemoveContainer" containerID="ea3a2115a282f6ba849c903139ac342ea4d280f4ca9dbebec0216cb6655bfa03" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.447789 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:55:03 crc kubenswrapper[5200]: E1126 14:55:03.448049 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.451357 5200 generic.go:358] "Generic (PLEG): container finished" podID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerID="dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3" exitCode=0 Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.451446 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerDied","Data":"dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3"} Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.451469 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2795" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.451490 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2795" event={"ID":"f15e26ce-61c0-4c5f-a6b2-67b07960aa03","Type":"ContainerDied","Data":"121ded7668009ab8f26cfa624b36f2f40ab3eb2685b8eabea721731af57d78ed"} Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.455005 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tq8zl" event={"ID":"86609561-9f57-4755-829a-f4f24aa2ef2c","Type":"ContainerStarted","Data":"dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21"} Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.495145 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tq8zl" podStartSLOduration=4.6484226060000005 podStartE2EDuration="5.495130065s" podCreationTimestamp="2025-11-26 14:54:58 +0000 UTC" firstStartedPulling="2025-11-26 14:55:00.418274593 +0000 UTC m=+5069.097476421" lastFinishedPulling="2025-11-26 14:55:01.264982052 +0000 UTC m=+5069.944183880" observedRunningTime="2025-11-26 14:55:03.491624792 +0000 UTC m=+5072.170826620" watchObservedRunningTime="2025-11-26 14:55:03.495130065 +0000 UTC m=+5072.174331883" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.508070 5200 scope.go:117] "RemoveContainer" containerID="dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.531323 5200 scope.go:117] "RemoveContainer" containerID="1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.532983 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2795"] Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.540737 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d2795"] Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.559792 5200 scope.go:117] "RemoveContainer" containerID="c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.579500 5200 scope.go:117] "RemoveContainer" containerID="dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3" Nov 26 14:55:03 crc kubenswrapper[5200]: E1126 14:55:03.580020 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3\": container with ID starting with dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3 not found: ID does not exist" containerID="dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.580063 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3"} err="failed to get container status \"dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3\": rpc error: code = NotFound desc = could not find container \"dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3\": container with ID starting with dc7d4223d46f5015cbd2b97cdf59f58d970b24480a0136c3abd88bb4c89c05f3 not found: ID does not exist" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.580088 5200 scope.go:117] "RemoveContainer" containerID="1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93" Nov 26 14:55:03 crc kubenswrapper[5200]: E1126 14:55:03.580457 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93\": container with ID starting with 1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93 not found: ID does not exist" containerID="1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.580488 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93"} err="failed to get container status \"1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93\": rpc error: code = NotFound desc = could not find container \"1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93\": container with ID starting with 1c4b32da512df081a68e24206fa72c9b6862d12095e7facdc49f98eb695d3d93 not found: ID does not exist" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.580506 5200 scope.go:117] "RemoveContainer" containerID="c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab" Nov 26 14:55:03 crc kubenswrapper[5200]: E1126 14:55:03.580889 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab\": container with ID starting with c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab not found: ID does not exist" containerID="c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab" Nov 26 14:55:03 crc kubenswrapper[5200]: I1126 14:55:03.580914 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab"} err="failed to get container status \"c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab\": rpc error: code = NotFound desc = could not find container \"c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab\": container with ID starting with c6ac1ead13fd6db3bd31de5237421a07805e107126cbdfb4bc30dacc94acb1ab not found: ID does not exist" Nov 26 14:55:04 crc kubenswrapper[5200]: I1126 14:55:04.977976 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" 
path="/var/lib/kubelet/pods/f15e26ce-61c0-4c5f-a6b2-67b07960aa03/volumes" Nov 26 14:55:09 crc kubenswrapper[5200]: I1126 14:55:09.359713 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:55:09 crc kubenswrapper[5200]: I1126 14:55:09.360216 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:55:09 crc kubenswrapper[5200]: I1126 14:55:09.401437 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:55:09 crc kubenswrapper[5200]: I1126 14:55:09.544759 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:55:09 crc kubenswrapper[5200]: I1126 14:55:09.633422 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tq8zl"] Nov 26 14:55:11 crc kubenswrapper[5200]: I1126 14:55:11.518551 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tq8zl" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="registry-server" containerID="cri-o://dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21" gracePeriod=2 Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.456702 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.524209 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj9h8\" (UniqueName: \"kubernetes.io/projected/86609561-9f57-4755-829a-f4f24aa2ef2c-kube-api-access-wj9h8\") pod \"86609561-9f57-4755-829a-f4f24aa2ef2c\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.524449 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-catalog-content\") pod \"86609561-9f57-4755-829a-f4f24aa2ef2c\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.524515 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-utilities\") pod \"86609561-9f57-4755-829a-f4f24aa2ef2c\" (UID: \"86609561-9f57-4755-829a-f4f24aa2ef2c\") " Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.527511 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-utilities" (OuterVolumeSpecName: "utilities") pod "86609561-9f57-4755-829a-f4f24aa2ef2c" (UID: "86609561-9f57-4755-829a-f4f24aa2ef2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.531947 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86609561-9f57-4755-829a-f4f24aa2ef2c-kube-api-access-wj9h8" (OuterVolumeSpecName: "kube-api-access-wj9h8") pod "86609561-9f57-4755-829a-f4f24aa2ef2c" (UID: "86609561-9f57-4755-829a-f4f24aa2ef2c"). InnerVolumeSpecName "kube-api-access-wj9h8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.539715 5200 generic.go:358] "Generic (PLEG): container finished" podID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerID="dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21" exitCode=0 Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.540212 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tq8zl" event={"ID":"86609561-9f57-4755-829a-f4f24aa2ef2c","Type":"ContainerDied","Data":"dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21"} Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.540321 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tq8zl" event={"ID":"86609561-9f57-4755-829a-f4f24aa2ef2c","Type":"ContainerDied","Data":"eb9d643be006b09a9c12c3874165b93a20642ec202657675511762ff30818877"} Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.540413 5200 scope.go:117] "RemoveContainer" containerID="dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.540629 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tq8zl" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.552847 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86609561-9f57-4755-829a-f4f24aa2ef2c" (UID: "86609561-9f57-4755-829a-f4f24aa2ef2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.566796 5200 scope.go:117] "RemoveContainer" containerID="e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.582733 5200 scope.go:117] "RemoveContainer" containerID="3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.613491 5200 scope.go:117] "RemoveContainer" containerID="dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21" Nov 26 14:55:12 crc kubenswrapper[5200]: E1126 14:55:12.614288 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21\": container with ID starting with dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21 not found: ID does not exist" containerID="dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.614347 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21"} err="failed to get container status \"dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21\": rpc error: code = NotFound desc = could not find container \"dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21\": container with ID starting with dd133602269f509f4f211da2a514ea51055e9e3e2c6c4be706b89dd95f401a21 not found: ID does not exist" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.614378 5200 scope.go:117] "RemoveContainer" containerID="e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286" Nov 
26 14:55:12 crc kubenswrapper[5200]: E1126 14:55:12.614884 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286\": container with ID starting with e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286 not found: ID does not exist" containerID="e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.614934 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286"} err="failed to get container status \"e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286\": rpc error: code = NotFound desc = could not find container \"e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286\": container with ID starting with e89d36e1cc973f8a8ff7992c9234265c3bc8cc3e3239555d4ed084927e55c286 not found: ID does not exist" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.614952 5200 scope.go:117] "RemoveContainer" containerID="3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b" Nov 26 14:55:12 crc kubenswrapper[5200]: E1126 14:55:12.615469 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b\": container with ID starting with 3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b not found: ID does not exist" containerID="3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.615493 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b"} err="failed to get container status \"3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b\": rpc error: code = NotFound desc = could not find container \"3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b\": container with ID starting with 3c1ff3acf9726c30198e6b642f3c384ea95d15e171f50156ee8e9acd9658033b not found: ID does not exist" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.626755 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.626778 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86609561-9f57-4755-829a-f4f24aa2ef2c-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.626788 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj9h8\" (UniqueName: \"kubernetes.io/projected/86609561-9f57-4755-829a-f4f24aa2ef2c-kube-api-access-wj9h8\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.882761 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tq8zl"] Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.891225 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tq8zl"] Nov 26 14:55:12 crc kubenswrapper[5200]: I1126 14:55:12.978279 5200 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" path="/var/lib/kubelet/pods/86609561-9f57-4755-829a-f4f24aa2ef2c/volumes" Nov 26 14:55:13 crc kubenswrapper[5200]: I1126 14:55:13.969645 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:55:13 crc kubenswrapper[5200]: E1126 14:55:13.970095 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:55:27 crc kubenswrapper[5200]: I1126 14:55:27.969223 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:55:27 crc kubenswrapper[5200]: E1126 14:55:27.971140 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:55:36 crc kubenswrapper[5200]: I1126 14:55:36.891375 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:55:36 crc kubenswrapper[5200]: I1126 14:55:36.891423 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 14:55:36 crc kubenswrapper[5200]: I1126 14:55:36.897744 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:55:36 crc kubenswrapper[5200]: I1126 14:55:36.898357 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.491723 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493753 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="registry-server" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493786 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="registry-server" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493818 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="registry-server" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493825 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="registry-server" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493840 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="extract-content" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493847 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="extract-content" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493863 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="extract-utilities" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493869 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="extract-utilities" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493888 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="extract-content" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493894 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="extract-content" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493910 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="extract-utilities" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.493915 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="extract-utilities" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.494074 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f15e26ce-61c0-4c5f-a6b2-67b07960aa03" containerName="registry-server" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.494086 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="86609561-9f57-4755-829a-f4f24aa2ef2c" containerName="registry-server" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.506654 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.506831 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.513146 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"default-dockercfg-8vwh2\"" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.579254 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlnnp\" (UniqueName: \"kubernetes.io/projected/c13a2e17-c871-4417-80b1-e1c60cf6d64c-kube-api-access-jlnnp\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.579559 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.680773 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.680864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlnnp\" (UniqueName: \"kubernetes.io/projected/c13a2e17-c871-4417-80b1-e1c60cf6d64c-kube-api-access-jlnnp\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.684179 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.684224 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8d9d53f035355f54914c9034c381d398fbe75d684c932948cac80345f0de7421/globalmount\"" pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.706346 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlnnp\" (UniqueName: \"kubernetes.io/projected/c13a2e17-c871-4417-80b1-e1c60cf6d64c-kube-api-access-jlnnp\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.713576 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") pod \"mariadb-copy-data\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " pod="openstack/mariadb-copy-data" Nov 26 14:55:38 crc kubenswrapper[5200]: I1126 14:55:38.839185 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Nov 26 14:55:39 crc kubenswrapper[5200]: I1126 14:55:39.358600 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 26 14:55:39 crc kubenswrapper[5200]: I1126 14:55:39.771314 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c13a2e17-c871-4417-80b1-e1c60cf6d64c","Type":"ContainerStarted","Data":"0dac33125f4702659e50f0807691fe4181adc25b9fe47c074981155bbe9e189d"} Nov 26 14:55:39 crc kubenswrapper[5200]: I1126 14:55:39.771933 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c13a2e17-c871-4417-80b1-e1c60cf6d64c","Type":"ContainerStarted","Data":"84039a7c1a282518b151f3debcdb075cd8e7ae197d6795c83b4230064aff255b"} Nov 26 14:55:40 crc kubenswrapper[5200]: I1126 14:55:40.793572 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.7935458730000002 podStartE2EDuration="3.793545873s" podCreationTimestamp="2025-11-26 14:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:55:40.791029737 +0000 UTC m=+5109.470231555" watchObservedRunningTime="2025-11-26 14:55:40.793545873 +0000 UTC m=+5109.472747691" Nov 26 14:55:41 crc kubenswrapper[5200]: I1126 14:55:41.970069 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:55:41 crc kubenswrapper[5200]: E1126 14:55:41.970892 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:55:42 crc kubenswrapper[5200]: E1126 14:55:42.040635 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1310715 actualBytes=10240 Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.606998 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.645954 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.646616 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.770591 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvjrl\" (UniqueName: \"kubernetes.io/projected/027fa604-2d4b-4dac-868d-228d8a2d7270-kube-api-access-cvjrl\") pod \"mariadb-client\" (UID: \"027fa604-2d4b-4dac-868d-228d8a2d7270\") " pod="openstack/mariadb-client" Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.871798 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvjrl\" (UniqueName: \"kubernetes.io/projected/027fa604-2d4b-4dac-868d-228d8a2d7270-kube-api-access-cvjrl\") pod \"mariadb-client\" (UID: \"027fa604-2d4b-4dac-868d-228d8a2d7270\") " pod="openstack/mariadb-client" Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.893193 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvjrl\" (UniqueName: \"kubernetes.io/projected/027fa604-2d4b-4dac-868d-228d8a2d7270-kube-api-access-cvjrl\") pod \"mariadb-client\" (UID: \"027fa604-2d4b-4dac-868d-228d8a2d7270\") " pod="openstack/mariadb-client" Nov 26 14:55:43 crc kubenswrapper[5200]: I1126 14:55:43.969407 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:44 crc kubenswrapper[5200]: I1126 14:55:44.378289 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:44 crc kubenswrapper[5200]: W1126 14:55:44.383675 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027fa604_2d4b_4dac_868d_228d8a2d7270.slice/crio-8fbd37df23535d9e5d9488e6424ba1407e123c9079e7b13075d0220184e4d046 WatchSource:0}: Error finding container 8fbd37df23535d9e5d9488e6424ba1407e123c9079e7b13075d0220184e4d046: Status 404 returned error can't find the container with id 8fbd37df23535d9e5d9488e6424ba1407e123c9079e7b13075d0220184e4d046 Nov 26 14:55:44 crc kubenswrapper[5200]: I1126 14:55:44.814962 5200 generic.go:358] "Generic (PLEG): container finished" podID="027fa604-2d4b-4dac-868d-228d8a2d7270" containerID="c723b510ab6f60d75d696fad653e16b16f43f948b48f8a41054f34bb53558583" exitCode=0 Nov 26 14:55:44 crc kubenswrapper[5200]: I1126 14:55:44.815026 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"027fa604-2d4b-4dac-868d-228d8a2d7270","Type":"ContainerDied","Data":"c723b510ab6f60d75d696fad653e16b16f43f948b48f8a41054f34bb53558583"} Nov 26 14:55:44 crc kubenswrapper[5200]: I1126 14:55:44.815431 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"027fa604-2d4b-4dac-868d-228d8a2d7270","Type":"ContainerStarted","Data":"8fbd37df23535d9e5d9488e6424ba1407e123c9079e7b13075d0220184e4d046"} Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.146804 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.170749 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_027fa604-2d4b-4dac-868d-228d8a2d7270/mariadb-client/0.log" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.198515 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.204442 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvjrl\" (UniqueName: \"kubernetes.io/projected/027fa604-2d4b-4dac-868d-228d8a2d7270-kube-api-access-cvjrl\") pod \"027fa604-2d4b-4dac-868d-228d8a2d7270\" (UID: \"027fa604-2d4b-4dac-868d-228d8a2d7270\") " Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.206006 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.209544 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027fa604-2d4b-4dac-868d-228d8a2d7270-kube-api-access-cvjrl" (OuterVolumeSpecName: "kube-api-access-cvjrl") pod "027fa604-2d4b-4dac-868d-228d8a2d7270" (UID: "027fa604-2d4b-4dac-868d-228d8a2d7270"). InnerVolumeSpecName "kube-api-access-cvjrl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.306413 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvjrl\" (UniqueName: \"kubernetes.io/projected/027fa604-2d4b-4dac-868d-228d8a2d7270-kube-api-access-cvjrl\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.343644 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.345631 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027fa604-2d4b-4dac-868d-228d8a2d7270" containerName="mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.345669 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="027fa604-2d4b-4dac-868d-228d8a2d7270" containerName="mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.346233 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="027fa604-2d4b-4dac-868d-228d8a2d7270" containerName="mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.484342 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.484469 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.509349 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2twv\" (UniqueName: \"kubernetes.io/projected/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1-kube-api-access-q2twv\") pod \"mariadb-client\" (UID: \"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1\") " pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.610565 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2twv\" (UniqueName: \"kubernetes.io/projected/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1-kube-api-access-q2twv\") pod \"mariadb-client\" (UID: \"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1\") " pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.636536 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2twv\" (UniqueName: \"kubernetes.io/projected/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1-kube-api-access-q2twv\") pod \"mariadb-client\" (UID: \"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1\") " pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.809994 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.846990 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.847085 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fbd37df23535d9e5d9488e6424ba1407e123c9079e7b13075d0220184e4d046" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.875141 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="027fa604-2d4b-4dac-868d-228d8a2d7270" podUID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" Nov 26 14:55:46 crc kubenswrapper[5200]: I1126 14:55:46.979988 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027fa604-2d4b-4dac-868d-228d8a2d7270" path="/var/lib/kubelet/pods/027fa604-2d4b-4dac-868d-228d8a2d7270/volumes" Nov 26 14:55:47 crc kubenswrapper[5200]: I1126 14:55:47.329300 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:47 crc kubenswrapper[5200]: W1126 14:55:47.336522 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod551d2c53_7f20_4c20_aef2_1ac0b37a3bf1.slice/crio-d45043537aafed5303ee832d115e7f736a507ea34b785c10c3495d1915e9abd7 WatchSource:0}: Error finding container d45043537aafed5303ee832d115e7f736a507ea34b785c10c3495d1915e9abd7: Status 404 returned error can't find the container with id d45043537aafed5303ee832d115e7f736a507ea34b785c10c3495d1915e9abd7 Nov 26 14:55:47 crc kubenswrapper[5200]: I1126 14:55:47.861406 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1","Type":"ContainerStarted","Data":"5738e47ee40ac9d3049af9fa752126b776493e6b2d135e85890cb16e0c01fb37"} Nov 26 14:55:47 crc kubenswrapper[5200]: I1126 14:55:47.861775 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1","Type":"ContainerStarted","Data":"d45043537aafed5303ee832d115e7f736a507ea34b785c10c3495d1915e9abd7"} Nov 26 14:55:48 crc kubenswrapper[5200]: I1126 14:55:48.873286 5200 generic.go:358] "Generic (PLEG): container finished" podID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" containerID="5738e47ee40ac9d3049af9fa752126b776493e6b2d135e85890cb16e0c01fb37" exitCode=0 Nov 26 14:55:48 crc kubenswrapper[5200]: I1126 14:55:48.873392 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1","Type":"ContainerDied","Data":"5738e47ee40ac9d3049af9fa752126b776493e6b2d135e85890cb16e0c01fb37"} Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.251584 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.270396 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2twv\" (UniqueName: \"kubernetes.io/projected/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1-kube-api-access-q2twv\") pod \"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1\" (UID: \"551d2c53-7f20-4c20-aef2-1ac0b37a3bf1\") " Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.300475 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1-kube-api-access-q2twv" (OuterVolumeSpecName: "kube-api-access-q2twv") pod "551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" (UID: "551d2c53-7f20-4c20-aef2-1ac0b37a3bf1"). InnerVolumeSpecName "kube-api-access-q2twv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.305113 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_551d2c53-7f20-4c20-aef2-1ac0b37a3bf1/mariadb-client/0.log" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.334172 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.339080 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.372476 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2twv\" (UniqueName: \"kubernetes.io/projected/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1-kube-api-access-q2twv\") on node \"crc\" DevicePath \"\"" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.897289 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.897336 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45043537aafed5303ee832d115e7f736a507ea34b785c10c3495d1915e9abd7" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.914434 5200 status_manager.go:895] "Failed to get status for pod" podUID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" pod="openstack/mariadb-client" err="pods \"mariadb-client\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 14:55:50 crc kubenswrapper[5200]: I1126 14:55:50.980222 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" path="/var/lib/kubelet/pods/551d2c53-7f20-4c20-aef2-1ac0b37a3bf1/volumes" Nov 26 14:55:52 crc kubenswrapper[5200]: I1126 14:55:52.975366 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:55:52 crc kubenswrapper[5200]: E1126 14:55:52.975998 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:56:07 crc kubenswrapper[5200]: I1126 14:56:07.969762 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:56:07 crc kubenswrapper[5200]: E1126 14:56:07.970483 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:56:18 crc kubenswrapper[5200]: I1126 14:56:18.969742 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:56:18 crc kubenswrapper[5200]: E1126 14:56:18.970888 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:56:32 crc kubenswrapper[5200]: I1126 14:56:32.982474 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:56:32 crc kubenswrapper[5200]: E1126 14:56:32.983186 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:56:41 crc kubenswrapper[5200]: E1126 14:56:41.896711 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1310713 actualBytes=10240 Nov 26 14:56:45 crc kubenswrapper[5200]: I1126 14:56:45.969328 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:56:45 crc kubenswrapper[5200]: E1126 14:56:45.970765 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:56:57 crc kubenswrapper[5200]: I1126 14:56:57.969278 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:56:57 crc kubenswrapper[5200]: E1126 14:56:57.970325 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.604561 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.605915 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" containerName="mariadb-client" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.605931 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" containerName="mariadb-client" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.606103 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="551d2c53-7f20-4c20-aef2-1ac0b37a3bf1" containerName="mariadb-client" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.616990 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.621381 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.630368 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.644301 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.646106 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-nb-config\"" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.646441 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-nb-scripts\"" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.646643 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncluster-ovndbcluster-nb-dockercfg-48jxq\"" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.653007 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.667071 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.672363 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.703550 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772291 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d667330-6af2-4a54-95da-7322f42da901-config\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772682 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a4832a-90a2-4b28-a12a-0743b9f65bb6-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772733 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772761 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z77v6\" (UniqueName: \"kubernetes.io/projected/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-kube-api-access-z77v6\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772786 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a4832a-90a2-4b28-a12a-0743b9f65bb6-config\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772810 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772826 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xj2\" (UniqueName: \"kubernetes.io/projected/1d667330-6af2-4a54-95da-7322f42da901-kube-api-access-44xj2\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.772846 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d667330-6af2-4a54-95da-7322f42da901-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773005 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a4832a-90a2-4b28-a12a-0743b9f65bb6-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773080 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-config\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773134 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773242 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d667330-6af2-4a54-95da-7322f42da901-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773324 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xrgc\" (UniqueName: \"kubernetes.io/projected/36a4832a-90a2-4b28-a12a-0743b9f65bb6-kube-api-access-4xrgc\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773386 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773418 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/36a4832a-90a2-4b28-a12a-0743b9f65bb6-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773458 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773520 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.773591 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d667330-6af2-4a54-95da-7322f42da901-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.803170 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.808657 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.813398 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncluster-ovndbcluster-sb-dockercfg-h8nk5\"" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.813712 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-sb-scripts\"" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.814418 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-sb-config\"" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.830811 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.838979 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.849186 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.849543 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.855046 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.858918 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.871087 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874729 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874779 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d667330-6af2-4a54-95da-7322f42da901-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874808 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xrgc\" (UniqueName: \"kubernetes.io/projected/36a4832a-90a2-4b28-a12a-0743b9f65bb6-kube-api-access-4xrgc\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874833 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874847 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a4832a-90a2-4b28-a12a-0743b9f65bb6-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874866 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874898 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874929 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d667330-6af2-4a54-95da-7322f42da901-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874970 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d667330-6af2-4a54-95da-7322f42da901-config\") pod 
\"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.874985 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a4832a-90a2-4b28-a12a-0743b9f65bb6-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875003 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875024 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z77v6\" (UniqueName: \"kubernetes.io/projected/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-kube-api-access-z77v6\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875055 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a4832a-90a2-4b28-a12a-0743b9f65bb6-config\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875079 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875097 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44xj2\" (UniqueName: \"kubernetes.io/projected/1d667330-6af2-4a54-95da-7322f42da901-kube-api-access-44xj2\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875116 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d667330-6af2-4a54-95da-7322f42da901-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875153 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a4832a-90a2-4b28-a12a-0743b9f65bb6-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.875175 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-config\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.876258 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.876375 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a4832a-90a2-4b28-a12a-0743b9f65bb6-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.876745 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-config\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.876834 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a4832a-90a2-4b28-a12a-0743b9f65bb6-config\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.877365 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.877631 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d667330-6af2-4a54-95da-7322f42da901-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.877830 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d667330-6af2-4a54-95da-7322f42da901-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.878002 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a4832a-90a2-4b28-a12a-0743b9f65bb6-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.878369 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.878459 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9caddb8332d0f1fddc7d175e60106525aab9099fd496acd870486146e4e8fb4f/globalmount\"" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.878763 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d667330-6af2-4a54-95da-7322f42da901-config\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.880161 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.880203 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7d78d18321938f7704cd1092588358c2ca3efd5d53ca6a278d76e46e5b4747a2/globalmount\"" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.880742 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.880776 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/781852e1511c42b79db385dbc5700e7c7a4912da397d521a4c8886f48d150ed6/globalmount\"" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.895843 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z77v6\" (UniqueName: \"kubernetes.io/projected/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-kube-api-access-z77v6\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.903996 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d667330-6af2-4a54-95da-7322f42da901-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.917731 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a4832a-90a2-4b28-a12a-0743b9f65bb6-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.918414 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xj2\" (UniqueName: \"kubernetes.io/projected/1d667330-6af2-4a54-95da-7322f42da901-kube-api-access-44xj2\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.921106 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xrgc\" (UniqueName: \"kubernetes.io/projected/36a4832a-90a2-4b28-a12a-0743b9f65bb6-kube-api-access-4xrgc\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.924578 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05004e62-63f8-47c6-a0ca-28b5f9c8fb08-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.938455 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ee3b9cd-975e-4afb-a76e-f09cebae8a5e\") pod \"ovsdbserver-nb-1\" (UID: \"36a4832a-90a2-4b28-a12a-0743b9f65bb6\") " pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.941123 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf67a586-fb3c-4d16-b904-c3534c9adb03\") pod \"ovsdbserver-nb-2\" (UID: \"1d667330-6af2-4a54-95da-7322f42da901\") " 
pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.945412 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-101989e3-e55b-4b04-af43-a4344fef2bc8\") pod \"ovsdbserver-nb-0\" (UID: \"05004e62-63f8-47c6-a0ca-28b5f9c8fb08\") " pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.974587 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.975975 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b16680fa-1b39-4192-b0df-94b79511678a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b16680fa-1b39-4192-b0df-94b79511678a\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976021 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976058 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwldq\" (UniqueName: \"kubernetes.io/projected/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-kube-api-access-lwldq\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976090 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-config\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976147 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976224 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600f88b1-0e72-4a09-a577-dc179d2193e6-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976249 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/600f88b1-0e72-4a09-a577-dc179d2193e6-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976267 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-75f204d3-3837-411e-9046-401292cd7ffa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75f204d3-3837-411e-9046-401292cd7ffa\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976284 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-config\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976303 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976396 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976506 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx77j\" (UniqueName: \"kubernetes.io/projected/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-kube-api-access-mx77j\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976581 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976634 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976661 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976780 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986ls\" (UniqueName: \"kubernetes.io/projected/600f88b1-0e72-4a09-a577-dc179d2193e6-kube-api-access-986ls\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976928 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/600f88b1-0e72-4a09-a577-dc179d2193e6-config\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.976985 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/600f88b1-0e72-4a09-a577-dc179d2193e6-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.989578 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:02 crc kubenswrapper[5200]: I1126 14:57:02.997123 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079416 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/600f88b1-0e72-4a09-a577-dc179d2193e6-config\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079696 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/600f88b1-0e72-4a09-a577-dc179d2193e6-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079732 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b16680fa-1b39-4192-b0df-94b79511678a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b16680fa-1b39-4192-b0df-94b79511678a\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079760 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079789 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwldq\" (UniqueName: \"kubernetes.io/projected/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-kube-api-access-lwldq\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079814 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-config\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079839 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc 
kubenswrapper[5200]: I1126 14:57:03.079906 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600f88b1-0e72-4a09-a577-dc179d2193e6-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079941 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/600f88b1-0e72-4a09-a577-dc179d2193e6-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.079967 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-75f204d3-3837-411e-9046-401292cd7ffa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75f204d3-3837-411e-9046-401292cd7ffa\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080011 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-config\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080051 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080092 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080132 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx77j\" (UniqueName: \"kubernetes.io/projected/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-kube-api-access-mx77j\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080154 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080173 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080192 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-scripts\") pod 
\"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.080251 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-986ls\" (UniqueName: \"kubernetes.io/projected/600f88b1-0e72-4a09-a577-dc179d2193e6-kube-api-access-986ls\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.081295 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/600f88b1-0e72-4a09-a577-dc179d2193e6-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.082535 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-config\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.082746 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/600f88b1-0e72-4a09-a577-dc179d2193e6-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.083073 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.083785 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-config\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.084193 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.084239 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-b16680fa-1b39-4192-b0df-94b79511678a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b16680fa-1b39-4192-b0df-94b79511678a\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/606149aee1e95ec135d6672d5e2565423f08351531df4dfad237bf2c7835e219/globalmount\"" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.084405 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.084432 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-75f204d3-3837-411e-9046-401292cd7ffa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75f204d3-3837-411e-9046-401292cd7ffa\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0f07da3783b70869df7da60d008b26a3fe3e820a55dd58cd8a08c824cc534c0b/globalmount\"" pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.084758 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.084780 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0e028aea1e1202a4ddaf660f6411a5c5b879c24913dad3db267d2951ac6defe6/globalmount\"" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.085128 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/600f88b1-0e72-4a09-a577-dc179d2193e6-config\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.086636 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/600f88b1-0e72-4a09-a577-dc179d2193e6-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.086996 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.087074 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.087231 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.092003 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.093469 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.108800 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-986ls\" (UniqueName: \"kubernetes.io/projected/600f88b1-0e72-4a09-a577-dc179d2193e6-kube-api-access-986ls\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.123801 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwldq\" (UniqueName: \"kubernetes.io/projected/b8227c6b-ba6c-4c71-81bd-06d34798fcc5-kube-api-access-lwldq\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.126828 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx77j\" (UniqueName: \"kubernetes.io/projected/0e8f2732-c3fe-4e7e-b3a3-7563d86ca513-kube-api-access-mx77j\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.155915 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6090344-4b49-4257-b7ee-86f1aeb3535a\") pod \"ovsdbserver-sb-1\" (UID: \"b8227c6b-ba6c-4c71-81bd-06d34798fcc5\") " pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.166836 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b16680fa-1b39-4192-b0df-94b79511678a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b16680fa-1b39-4192-b0df-94b79511678a\") pod \"ovsdbserver-sb-0\" (UID: \"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513\") " pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.168231 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-75f204d3-3837-411e-9046-401292cd7ffa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75f204d3-3837-411e-9046-401292cd7ffa\") pod \"ovsdbserver-sb-2\" (UID: \"600f88b1-0e72-4a09-a577-dc179d2193e6\") " pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.186320 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.435901 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.467369 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.510723 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.604314 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05004e62-63f8-47c6-a0ca-28b5f9c8fb08","Type":"ContainerStarted","Data":"d39a8ca4501bd361dded09098278b1e5e44db15735fec97ea7a67d2d0138e7a5"} Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.612369 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 26 14:57:03 crc kubenswrapper[5200]: W1126 14:57:03.624457 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d667330_6af2_4a54_95da_7322f42da901.slice/crio-aff550ffbef238b203735d59a87d32c6ae8dfb4bb6b9276769a4e4c2acdac780 WatchSource:0}: Error finding container aff550ffbef238b203735d59a87d32c6ae8dfb4bb6b9276769a4e4c2acdac780: Status 404 returned error can't find the container with id aff550ffbef238b203735d59a87d32c6ae8dfb4bb6b9276769a4e4c2acdac780 Nov 26 14:57:03 crc kubenswrapper[5200]: I1126 14:57:03.698749 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 26 14:57:03 crc kubenswrapper[5200]: W1126 14:57:03.705448 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8227c6b_ba6c_4c71_81bd_06d34798fcc5.slice/crio-74fe12de5a2c90782c2ab1572ab84f5e1371f23800fe1122eab1b8377aec269d WatchSource:0}: Error finding container 74fe12de5a2c90782c2ab1572ab84f5e1371f23800fe1122eab1b8377aec269d: Status 404 returned error can't find the container with id 74fe12de5a2c90782c2ab1572ab84f5e1371f23800fe1122eab1b8377aec269d Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.057147 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.150886 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 14:57:04 crc kubenswrapper[5200]: W1126 14:57:04.159104 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8f2732_c3fe_4e7e_b3a3_7563d86ca513.slice/crio-cf6745327cc89d8fd29d979e5270fa586c31287766ff08902538c6fad06029ef WatchSource:0}: Error finding container cf6745327cc89d8fd29d979e5270fa586c31287766ff08902538c6fad06029ef: Status 404 returned error can't find the container with id cf6745327cc89d8fd29d979e5270fa586c31287766ff08902538c6fad06029ef Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.513334 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 26 14:57:04 crc kubenswrapper[5200]: W1126 14:57:04.514454 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a4832a_90a2_4b28_a12a_0743b9f65bb6.slice/crio-e2d913642983bcc1812042dabe6fec0afd5a1936c7a63de31f882fe060ad518b WatchSource:0}: Error finding container e2d913642983bcc1812042dabe6fec0afd5a1936c7a63de31f882fe060ad518b: Status 404 returned error can't find the container with id e2d913642983bcc1812042dabe6fec0afd5a1936c7a63de31f882fe060ad518b Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.616052 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-1" event={"ID":"36a4832a-90a2-4b28-a12a-0743b9f65bb6","Type":"ContainerStarted","Data":"e2d913642983bcc1812042dabe6fec0afd5a1936c7a63de31f882fe060ad518b"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.618817 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"600f88b1-0e72-4a09-a577-dc179d2193e6","Type":"ContainerStarted","Data":"3248606df1992e019241abadff0b808ad992fcaa00b83d4dfb9acaaae9100dd8"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.618843 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"600f88b1-0e72-4a09-a577-dc179d2193e6","Type":"ContainerStarted","Data":"8ca9b5a4328524a3c5807d527bb914da30452a81e9810c12f3d6e0d816383b24"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.618852 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"600f88b1-0e72-4a09-a577-dc179d2193e6","Type":"ContainerStarted","Data":"b27ef51fc16db56f70b713c6286868fdb90259552278b5afd767aeff95bc4bde"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.625119 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"1d667330-6af2-4a54-95da-7322f42da901","Type":"ContainerStarted","Data":"61582a6776e0101e6138a44d562eeb57966b079a3c10ffe71ad85f6d5625ab04"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.625155 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"1d667330-6af2-4a54-95da-7322f42da901","Type":"ContainerStarted","Data":"ca230a00e7f6ddd529a8f30f94db9f5c1f0704c626ebe51f2c7c12c72163c337"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.625165 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"1d667330-6af2-4a54-95da-7322f42da901","Type":"ContainerStarted","Data":"aff550ffbef238b203735d59a87d32c6ae8dfb4bb6b9276769a4e4c2acdac780"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.629886 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05004e62-63f8-47c6-a0ca-28b5f9c8fb08","Type":"ContainerStarted","Data":"d3ef78be51eaa7ba9d53b6f5ca432d54566e691d43effb4b37229015eb566bea"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.629920 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05004e62-63f8-47c6-a0ca-28b5f9c8fb08","Type":"ContainerStarted","Data":"258cf3a0410bb1cad80656821ced3fce02da56a0dc840496db10962e96318fda"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.639006 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513","Type":"ContainerStarted","Data":"0e469509b729deea6da8b092a82156ad82355d45711f8ee98f2f1c429236bbb5"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.639050 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513","Type":"ContainerStarted","Data":"c631370f3126a744d5795a8e2510af641b5586968c7a9013e3617101181caa12"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.639062 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0e8f2732-c3fe-4e7e-b3a3-7563d86ca513","Type":"ContainerStarted","Data":"cf6745327cc89d8fd29d979e5270fa586c31287766ff08902538c6fad06029ef"} Nov 26 14:57:04 crc 
kubenswrapper[5200]: I1126 14:57:04.642227 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b8227c6b-ba6c-4c71-81bd-06d34798fcc5","Type":"ContainerStarted","Data":"dd55f091909006c8f4785412c8641f6d3ce26e2f36e70c2e84957b7724abff1d"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.642265 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b8227c6b-ba6c-4c71-81bd-06d34798fcc5","Type":"ContainerStarted","Data":"7fd24649d16b833380c6555e7ad64005b96d157dc228d44df4355c72f8c539c3"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.642276 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b8227c6b-ba6c-4c71-81bd-06d34798fcc5","Type":"ContainerStarted","Data":"74fe12de5a2c90782c2ab1572ab84f5e1371f23800fe1122eab1b8377aec269d"} Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.643415 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.643383417 podStartE2EDuration="3.643383417s" podCreationTimestamp="2025-11-26 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:04.63667456 +0000 UTC m=+5193.315876418" watchObservedRunningTime="2025-11-26 14:57:04.643383417 +0000 UTC m=+5193.322585235" Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.665366 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.665350176 podStartE2EDuration="3.665350176s" podCreationTimestamp="2025-11-26 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:04.658657299 +0000 UTC m=+5193.337859117" watchObservedRunningTime="2025-11-26 14:57:04.665350176 +0000 UTC m=+5193.344551994" Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.681794 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.681774679 podStartE2EDuration="3.681774679s" podCreationTimestamp="2025-11-26 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:04.680362602 +0000 UTC m=+5193.359564420" watchObservedRunningTime="2025-11-26 14:57:04.681774679 +0000 UTC m=+5193.360976527" Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.702916 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.702892436 podStartE2EDuration="3.702892436s" podCreationTimestamp="2025-11-26 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:04.696150648 +0000 UTC m=+5193.375352456" watchObservedRunningTime="2025-11-26 14:57:04.702892436 +0000 UTC m=+5193.382094254" Nov 26 14:57:04 crc kubenswrapper[5200]: I1126 14:57:04.716736 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.716722551 podStartE2EDuration="3.716722551s" podCreationTimestamp="2025-11-26 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
14:57:04.713377483 +0000 UTC m=+5193.392579301" watchObservedRunningTime="2025-11-26 14:57:04.716722551 +0000 UTC m=+5193.395924369" Nov 26 14:57:05 crc kubenswrapper[5200]: I1126 14:57:05.660838 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"36a4832a-90a2-4b28-a12a-0743b9f65bb6","Type":"ContainerStarted","Data":"87c7938d756f4fb016464d2f1b453cdf2e6370f8c91dbd072c871f21d158db92"} Nov 26 14:57:05 crc kubenswrapper[5200]: I1126 14:57:05.661353 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"36a4832a-90a2-4b28-a12a-0743b9f65bb6","Type":"ContainerStarted","Data":"0305a9508a1f8422a249d617012f110cf3ff3a853c50591ebfd652b7e8cb9bcb"} Nov 26 14:57:05 crc kubenswrapper[5200]: I1126 14:57:05.975098 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:05 crc kubenswrapper[5200]: I1126 14:57:05.990016 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:05 crc kubenswrapper[5200]: I1126 14:57:05.999199 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:06 crc kubenswrapper[5200]: I1126 14:57:06.186987 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:06 crc kubenswrapper[5200]: I1126 14:57:06.230939 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:06 crc kubenswrapper[5200]: I1126 14:57:06.258317 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=5.258294985 podStartE2EDuration="5.258294985s" podCreationTimestamp="2025-11-26 14:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:05.685675964 +0000 UTC m=+5194.364877822" watchObservedRunningTime="2025-11-26 14:57:06.258294985 +0000 UTC m=+5194.937496813" Nov 26 14:57:06 crc kubenswrapper[5200]: I1126 14:57:06.438015 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:06 crc kubenswrapper[5200]: I1126 14:57:06.468322 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:06 crc kubenswrapper[5200]: I1126 14:57:06.680388 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:07 crc kubenswrapper[5200]: I1126 14:57:07.975134 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:07 crc kubenswrapper[5200]: I1126 14:57:07.989840 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:07 crc kubenswrapper[5200]: I1126 14:57:07.999560 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:08 crc kubenswrapper[5200]: I1126 14:57:08.436441 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:08 crc kubenswrapper[5200]: I1126 14:57:08.467661 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-sb-2" Nov 26 
14:57:08 crc kubenswrapper[5200]: I1126 14:57:08.969621 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:57:08 crc kubenswrapper[5200]: E1126 14:57:08.970019 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.019210 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.034636 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.041806 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.089416 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.104112 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.321523 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d9b56d8b9-5cqwc"] Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.333965 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.337369 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovsdbserver-nb\"" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.344044 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9b56d8b9-5cqwc"] Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.419245 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7htjb\" (UniqueName: \"kubernetes.io/projected/c5638afe-13bb-4b94-8d68-e45b6d57abcf-kube-api-access-7htjb\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.419321 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.419469 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-dns-svc\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.419545 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-config\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.475677 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.508389 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.516434 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.521454 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.521555 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-dns-svc\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.521652 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-config\") pod 
\"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.521771 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7htjb\" (UniqueName: \"kubernetes.io/projected/c5638afe-13bb-4b94-8d68-e45b6d57abcf-kube-api-access-7htjb\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.522903 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-dns-svc\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.522966 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-config\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.523422 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.543183 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7htjb\" (UniqueName: \"kubernetes.io/projected/c5638afe-13bb-4b94-8d68-e45b6d57abcf-kube-api-access-7htjb\") pod \"dnsmasq-dns-5d9b56d8b9-5cqwc\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.569279 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.658355 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.758764 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9b56d8b9-5cqwc"] Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.826668 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.832051 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-655b9d9769-v5czh"] Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.847948 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-655b9d9769-v5czh"] Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.848101 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:09 crc kubenswrapper[5200]: I1126 14:57:09.850971 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovsdbserver-sb\"" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.033750 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxfbl\" (UniqueName: \"kubernetes.io/projected/46561b5f-ed2b-4ac6-924b-9feb39192c8a-kube-api-access-gxfbl\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.033806 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.033829 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-dns-svc\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.033876 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.033913 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-config\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.135587 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.135671 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-config\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.136613 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.136632 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-config\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.137214 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxfbl\" (UniqueName: \"kubernetes.io/projected/46561b5f-ed2b-4ac6-924b-9feb39192c8a-kube-api-access-gxfbl\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.137272 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.137302 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-dns-svc\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.137889 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-dns-svc\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.137930 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.163555 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxfbl\" (UniqueName: \"kubernetes.io/projected/46561b5f-ed2b-4ac6-924b-9feb39192c8a-kube-api-access-gxfbl\") pod \"dnsmasq-dns-655b9d9769-v5czh\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.183063 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.329349 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9b56d8b9-5cqwc"] Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.607879 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-655b9d9769-v5czh"] Nov 26 14:57:10 crc kubenswrapper[5200]: W1126 14:57:10.613316 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46561b5f_ed2b_4ac6_924b_9feb39192c8a.slice/crio-98b8c0383c62bd481346d9548d3343e07b605beaea844679fea3a531ab79865f WatchSource:0}: Error finding container 98b8c0383c62bd481346d9548d3343e07b605beaea844679fea3a531ab79865f: Status 404 returned error can't find the container with id 98b8c0383c62bd481346d9548d3343e07b605beaea844679fea3a531ab79865f Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.718530 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" event={"ID":"46561b5f-ed2b-4ac6-924b-9feb39192c8a","Type":"ContainerStarted","Data":"98b8c0383c62bd481346d9548d3343e07b605beaea844679fea3a531ab79865f"} Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.721863 5200 generic.go:358] "Generic (PLEG): container finished" podID="c5638afe-13bb-4b94-8d68-e45b6d57abcf" containerID="77cbf5bdf12ec4cda719b996f3352d9b9d66b5b18be534d3a2c79428d0f8da6d" exitCode=0 Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.721901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" event={"ID":"c5638afe-13bb-4b94-8d68-e45b6d57abcf","Type":"ContainerDied","Data":"77cbf5bdf12ec4cda719b996f3352d9b9d66b5b18be534d3a2c79428d0f8da6d"} Nov 26 14:57:10 crc kubenswrapper[5200]: I1126 14:57:10.722080 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" event={"ID":"c5638afe-13bb-4b94-8d68-e45b6d57abcf","Type":"ContainerStarted","Data":"2d2b6b79eecb5b0044f5c40a13032aaa91fbd08d2f6ab15dafe28483e2c68d5c"} Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.000115 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.153291 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-ovsdbserver-nb\") pod \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.153660 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-dns-svc\") pod \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.153896 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-config\") pod \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.153930 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7htjb\" (UniqueName: \"kubernetes.io/projected/c5638afe-13bb-4b94-8d68-e45b6d57abcf-kube-api-access-7htjb\") pod \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\" (UID: \"c5638afe-13bb-4b94-8d68-e45b6d57abcf\") " Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.158626 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5638afe-13bb-4b94-8d68-e45b6d57abcf-kube-api-access-7htjb" (OuterVolumeSpecName: "kube-api-access-7htjb") pod "c5638afe-13bb-4b94-8d68-e45b6d57abcf" (UID: "c5638afe-13bb-4b94-8d68-e45b6d57abcf"). InnerVolumeSpecName "kube-api-access-7htjb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.174804 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5638afe-13bb-4b94-8d68-e45b6d57abcf" (UID: "c5638afe-13bb-4b94-8d68-e45b6d57abcf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.175812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-config" (OuterVolumeSpecName: "config") pod "c5638afe-13bb-4b94-8d68-e45b6d57abcf" (UID: "c5638afe-13bb-4b94-8d68-e45b6d57abcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.176249 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5638afe-13bb-4b94-8d68-e45b6d57abcf" (UID: "c5638afe-13bb-4b94-8d68-e45b6d57abcf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.256016 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.256256 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.256321 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5638afe-13bb-4b94-8d68-e45b6d57abcf-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.256416 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7htjb\" (UniqueName: \"kubernetes.io/projected/c5638afe-13bb-4b94-8d68-e45b6d57abcf-kube-api-access-7htjb\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.732408 5200 generic.go:358] "Generic (PLEG): container finished" podID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerID="89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d" exitCode=0 Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.732512 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" event={"ID":"46561b5f-ed2b-4ac6-924b-9feb39192c8a","Type":"ContainerDied","Data":"89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d"} Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.736053 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.736177 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9b56d8b9-5cqwc" event={"ID":"c5638afe-13bb-4b94-8d68-e45b6d57abcf","Type":"ContainerDied","Data":"2d2b6b79eecb5b0044f5c40a13032aaa91fbd08d2f6ab15dafe28483e2c68d5c"} Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.736226 5200 scope.go:117] "RemoveContainer" containerID="77cbf5bdf12ec4cda719b996f3352d9b9d66b5b18be534d3a2c79428d0f8da6d" Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.974518 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9b56d8b9-5cqwc"] Nov 26 14:57:11 crc kubenswrapper[5200]: I1126 14:57:11.983273 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d9b56d8b9-5cqwc"] Nov 26 14:57:12 crc kubenswrapper[5200]: I1126 14:57:12.728582 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Nov 26 14:57:12 crc kubenswrapper[5200]: I1126 14:57:12.747012 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" event={"ID":"46561b5f-ed2b-4ac6-924b-9feb39192c8a","Type":"ContainerStarted","Data":"1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e"} Nov 26 14:57:12 crc kubenswrapper[5200]: I1126 14:57:12.747125 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:12 crc kubenswrapper[5200]: I1126 14:57:12.773074 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" podStartSLOduration=3.773056552 podStartE2EDuration="3.773056552s" podCreationTimestamp="2025-11-26 14:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:12.766807467 +0000 UTC m=+5201.446009305" watchObservedRunningTime="2025-11-26 14:57:12.773056552 +0000 UTC m=+5201.452258370" Nov 26 14:57:12 crc kubenswrapper[5200]: I1126 14:57:12.984454 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5638afe-13bb-4b94-8d68-e45b6d57abcf" path="/var/lib/kubelet/pods/c5638afe-13bb-4b94-8d68-e45b6d57abcf/volumes" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.127724 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.129420 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5638afe-13bb-4b94-8d68-e45b6d57abcf" containerName="init" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.129447 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5638afe-13bb-4b94-8d68-e45b6d57abcf" containerName="init" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.129627 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5638afe-13bb-4b94-8d68-e45b6d57abcf" containerName="init" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.137997 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.141589 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovn-data-cert\"" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.143850 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.287703 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-856a6966-322f-4462-b2fc-01a35ef27178\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.287813 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt9q9\" (UniqueName: \"kubernetes.io/projected/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-kube-api-access-pt9q9\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.287834 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.390554 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt9q9\" (UniqueName: \"kubernetes.io/projected/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-kube-api-access-pt9q9\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.390652 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.390933 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-856a6966-322f-4462-b2fc-01a35ef27178\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.394776 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.395398 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-856a6966-322f-4462-b2fc-01a35ef27178\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/170037132ca70bb4b6de41a708235c69812ee3e5da23c465960410d7f0b34eb1/globalmount\"" pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.397810 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.410168 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt9q9\" (UniqueName: \"kubernetes.io/projected/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-kube-api-access-pt9q9\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.424811 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-856a6966-322f-4462-b2fc-01a35ef27178\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") pod \"ovn-copy-data\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.468043 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 26 14:57:16 crc kubenswrapper[5200]: I1126 14:57:16.956432 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 26 14:57:17 crc kubenswrapper[5200]: I1126 14:57:17.788976 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff","Type":"ContainerStarted","Data":"c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86"} Nov 26 14:57:17 crc kubenswrapper[5200]: I1126 14:57:17.789317 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff","Type":"ContainerStarted","Data":"619d52b0308db8414497f3cd66903ddc4ef97a0cd9537664b42beb7c3916bfa4"} Nov 26 14:57:17 crc kubenswrapper[5200]: I1126 14:57:17.802869 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.802834076 podStartE2EDuration="3.802834076s" podCreationTimestamp="2025-11-26 14:57:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:17.800852384 +0000 UTC m=+5206.480054212" watchObservedRunningTime="2025-11-26 14:57:17.802834076 +0000 UTC m=+5206.482035904" Nov 26 14:57:18 crc kubenswrapper[5200]: I1126 14:57:18.759619 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:18 crc kubenswrapper[5200]: I1126 14:57:18.807471 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c584ff65-fbljb"] Nov 26 14:57:18 crc kubenswrapper[5200]: I1126 14:57:18.807785 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" containerName="dnsmasq-dns" containerID="cri-o://ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7" gracePeriod=10 Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.287555 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.343602 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7ph\" (UniqueName: \"kubernetes.io/projected/21fdd50d-227d-4dac-9f8e-71192a857118-kube-api-access-ss7ph\") pod \"21fdd50d-227d-4dac-9f8e-71192a857118\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.343845 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-config\") pod \"21fdd50d-227d-4dac-9f8e-71192a857118\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.343957 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-dns-svc\") pod \"21fdd50d-227d-4dac-9f8e-71192a857118\" (UID: \"21fdd50d-227d-4dac-9f8e-71192a857118\") " Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.349747 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21fdd50d-227d-4dac-9f8e-71192a857118-kube-api-access-ss7ph" (OuterVolumeSpecName: "kube-api-access-ss7ph") pod "21fdd50d-227d-4dac-9f8e-71192a857118" (UID: "21fdd50d-227d-4dac-9f8e-71192a857118"). InnerVolumeSpecName "kube-api-access-ss7ph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.387663 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-config" (OuterVolumeSpecName: "config") pod "21fdd50d-227d-4dac-9f8e-71192a857118" (UID: "21fdd50d-227d-4dac-9f8e-71192a857118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.400096 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21fdd50d-227d-4dac-9f8e-71192a857118" (UID: "21fdd50d-227d-4dac-9f8e-71192a857118"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.446732 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.446786 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss7ph\" (UniqueName: \"kubernetes.io/projected/21fdd50d-227d-4dac-9f8e-71192a857118-kube-api-access-ss7ph\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.446806 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21fdd50d-227d-4dac-9f8e-71192a857118-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.809569 5200 generic.go:358] "Generic (PLEG): container finished" podID="21fdd50d-227d-4dac-9f8e-71192a857118" containerID="ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7" exitCode=0 Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.809754 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.809766 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" event={"ID":"21fdd50d-227d-4dac-9f8e-71192a857118","Type":"ContainerDied","Data":"ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7"} Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.809990 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c584ff65-fbljb" event={"ID":"21fdd50d-227d-4dac-9f8e-71192a857118","Type":"ContainerDied","Data":"6169f5bc21ff14db0c722e17cdad40c184c73c46f0c39900cc0b110b8f4b953c"} Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.810035 5200 scope.go:117] "RemoveContainer" containerID="ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.835107 5200 scope.go:117] "RemoveContainer" containerID="ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.859900 5200 scope.go:117] "RemoveContainer" containerID="ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7" Nov 26 14:57:19 crc kubenswrapper[5200]: E1126 14:57:19.860340 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7\": container with ID starting with ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7 not found: ID does not exist" containerID="ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.860395 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7"} err="failed to get container status \"ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7\": rpc error: code = NotFound desc = could not find container \"ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7\": container with ID starting with ab433b187bf4bc5df5ddb1e0bb2d58f826ba2a2d8d35fb75264c3921b5672da7 not found: ID does not exist" Nov 26 14:57:19 crc kubenswrapper[5200]: 
I1126 14:57:19.860423 5200 scope.go:117] "RemoveContainer" containerID="ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48" Nov 26 14:57:19 crc kubenswrapper[5200]: E1126 14:57:19.860878 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48\": container with ID starting with ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48 not found: ID does not exist" containerID="ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.860911 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48"} err="failed to get container status \"ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48\": rpc error: code = NotFound desc = could not find container \"ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48\": container with ID starting with ee39b263bea014cdb76aac1283f8eddc021a186ec2412aeedcf58dc5438c5d48 not found: ID does not exist" Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.861404 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c584ff65-fbljb"] Nov 26 14:57:19 crc kubenswrapper[5200]: I1126 14:57:19.867293 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77c584ff65-fbljb"] Nov 26 14:57:20 crc kubenswrapper[5200]: I1126 14:57:20.980309 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" path="/var/lib/kubelet/pods/21fdd50d-227d-4dac-9f8e-71192a857118/volumes" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.875816 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.877209 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" containerName="init" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.877227 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" containerName="init" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.877247 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" containerName="dnsmasq-dns" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.877254 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" containerName="dnsmasq-dns" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.877439 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="21fdd50d-227d-4dac-9f8e-71192a857118" containerName="dnsmasq-dns" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.893729 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.893893 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.897374 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovnnorthd-ovnnorthd-dockercfg-gnqj9\"" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.897411 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovnnorthd-config\"" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.897412 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovnnorthd-scripts\"" Nov 26 14:57:22 crc kubenswrapper[5200]: I1126 14:57:22.976134 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:57:22 crc kubenswrapper[5200]: E1126 14:57:22.976475 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.011356 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gxgp\" (UniqueName: \"kubernetes.io/projected/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-kube-api-access-5gxgp\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.011450 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-config\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.011767 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.011871 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-scripts\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.011920 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.114133 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gxgp\" (UniqueName: \"kubernetes.io/projected/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-kube-api-access-5gxgp\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc 
kubenswrapper[5200]: I1126 14:57:23.114319 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-config\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.114409 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.114448 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-scripts\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.114506 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.115432 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.115533 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-config\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.115581 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-scripts\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.131972 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.141966 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gxgp\" (UniqueName: \"kubernetes.io/projected/659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d-kube-api-access-5gxgp\") pod \"ovn-northd-0\" (UID: \"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d\") " pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.224542 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 26 14:57:23 crc kubenswrapper[5200]: W1126 14:57:23.658899 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod659d0a2a_3ae4_4fcf_850e_37e4a6f2df4d.slice/crio-adae86af1dfc590f55253a08af85b4d8f076a51aab2569db008bdd94a01e9baf WatchSource:0}: Error finding container adae86af1dfc590f55253a08af85b4d8f076a51aab2569db008bdd94a01e9baf: Status 404 returned error can't find the container with id adae86af1dfc590f55253a08af85b4d8f076a51aab2569db008bdd94a01e9baf Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.658906 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.855306 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d","Type":"ContainerStarted","Data":"5e77b7bf69dc19283dcfefed4e5e1701cd340f6d34db026085d1041c82210e54"} Nov 26 14:57:23 crc kubenswrapper[5200]: I1126 14:57:23.855724 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d","Type":"ContainerStarted","Data":"adae86af1dfc590f55253a08af85b4d8f076a51aab2569db008bdd94a01e9baf"} Nov 26 14:57:24 crc kubenswrapper[5200]: I1126 14:57:24.864890 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d","Type":"ContainerStarted","Data":"2865adbe039cde40c87984d5f0c11b1bab8fb56f2c3c1b8f37acf1a20f518839"} Nov 26 14:57:24 crc kubenswrapper[5200]: I1126 14:57:24.866598 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-northd-0" Nov 26 14:57:24 crc kubenswrapper[5200]: I1126 14:57:24.894329 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.8943000100000003 podStartE2EDuration="2.89430001s" podCreationTimestamp="2025-11-26 14:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:24.889451483 +0000 UTC m=+5213.568653311" watchObservedRunningTime="2025-11-26 14:57:24.89430001 +0000 UTC m=+5213.573501858" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.578540 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cwzk4"] Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.640108 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cwzk4"] Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.640233 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.686238 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-a6e0-account-create-update-976w8"] Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.695013 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.695432 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a6e0-account-create-update-976w8"] Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.703305 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-db-secret\"" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.726282 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nff\" (UniqueName: \"kubernetes.io/projected/29d6650b-66a6-4341-b458-ada349675a6a-kube-api-access-48nff\") pod \"keystone-db-create-cwzk4\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.726385 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d6650b-66a6-4341-b458-ada349675a6a-operator-scripts\") pod \"keystone-db-create-cwzk4\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.827908 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-operator-scripts\") pod \"keystone-a6e0-account-create-update-976w8\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.828009 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d6650b-66a6-4341-b458-ada349675a6a-operator-scripts\") pod \"keystone-db-create-cwzk4\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.828066 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9x4f\" (UniqueName: \"kubernetes.io/projected/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-kube-api-access-f9x4f\") pod \"keystone-a6e0-account-create-update-976w8\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.828201 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48nff\" (UniqueName: \"kubernetes.io/projected/29d6650b-66a6-4341-b458-ada349675a6a-kube-api-access-48nff\") pod \"keystone-db-create-cwzk4\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.829837 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d6650b-66a6-4341-b458-ada349675a6a-operator-scripts\") pod \"keystone-db-create-cwzk4\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.849179 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nff\" (UniqueName: 
\"kubernetes.io/projected/29d6650b-66a6-4341-b458-ada349675a6a-kube-api-access-48nff\") pod \"keystone-db-create-cwzk4\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.931374 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-operator-scripts\") pod \"keystone-a6e0-account-create-update-976w8\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.931494 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9x4f\" (UniqueName: \"kubernetes.io/projected/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-kube-api-access-f9x4f\") pod \"keystone-a6e0-account-create-update-976w8\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.932332 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-operator-scripts\") pod \"keystone-a6e0-account-create-update-976w8\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.952561 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9x4f\" (UniqueName: \"kubernetes.io/projected/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-kube-api-access-f9x4f\") pod \"keystone-a6e0-account-create-update-976w8\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:27 crc kubenswrapper[5200]: I1126 14:57:27.958460 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.012269 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.409022 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cwzk4"] Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.478659 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a6e0-account-create-update-976w8"] Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.899356 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a6e0-account-create-update-976w8" event={"ID":"9d9bcfe0-f199-4c6c-8317-6beab93a44a4","Type":"ContainerStarted","Data":"e63d46ec4f4eeacb453324a2dbf8ad058535b8e7b5379224415b65ed9edf3a08"} Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.899401 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a6e0-account-create-update-976w8" event={"ID":"9d9bcfe0-f199-4c6c-8317-6beab93a44a4","Type":"ContainerStarted","Data":"afa19839515da28f77f6921a5c4647f3c278cf40c8d6a232ff63d1cdbdee17b2"} Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.901318 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cwzk4" event={"ID":"29d6650b-66a6-4341-b458-ada349675a6a","Type":"ContainerStarted","Data":"ec62efc77353b3b48cb443a7bfaa527aef9aee00631706174ff25b42d3925b8a"} Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.901348 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cwzk4" event={"ID":"29d6650b-66a6-4341-b458-ada349675a6a","Type":"ContainerStarted","Data":"98ba83d9ae3c2937089fda4e2765157d714082a7dacf08bcb2187dcd910bad04"} Nov 26 14:57:28 crc kubenswrapper[5200]: I1126 14:57:28.914853 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-a6e0-account-create-update-976w8" podStartSLOduration=1.914826248 podStartE2EDuration="1.914826248s" podCreationTimestamp="2025-11-26 14:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:28.913424872 +0000 UTC m=+5217.592626700" watchObservedRunningTime="2025-11-26 14:57:28.914826248 +0000 UTC m=+5217.594028066" Nov 26 14:57:29 crc kubenswrapper[5200]: I1126 14:57:29.913524 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d9bcfe0-f199-4c6c-8317-6beab93a44a4" containerID="e63d46ec4f4eeacb453324a2dbf8ad058535b8e7b5379224415b65ed9edf3a08" exitCode=0 Nov 26 14:57:29 crc kubenswrapper[5200]: I1126 14:57:29.913666 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a6e0-account-create-update-976w8" event={"ID":"9d9bcfe0-f199-4c6c-8317-6beab93a44a4","Type":"ContainerDied","Data":"e63d46ec4f4eeacb453324a2dbf8ad058535b8e7b5379224415b65ed9edf3a08"} Nov 26 14:57:29 crc kubenswrapper[5200]: I1126 14:57:29.919018 5200 generic.go:358] "Generic (PLEG): container finished" podID="29d6650b-66a6-4341-b458-ada349675a6a" containerID="ec62efc77353b3b48cb443a7bfaa527aef9aee00631706174ff25b42d3925b8a" exitCode=0 Nov 26 14:57:29 crc kubenswrapper[5200]: I1126 14:57:29.919210 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cwzk4" event={"ID":"29d6650b-66a6-4341-b458-ada349675a6a","Type":"ContainerDied","Data":"ec62efc77353b3b48cb443a7bfaa527aef9aee00631706174ff25b42d3925b8a"} Nov 26 14:57:29 crc kubenswrapper[5200]: I1126 14:57:29.934977 5200 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/keystone-db-create-cwzk4" podStartSLOduration=2.93495071 podStartE2EDuration="2.93495071s" podCreationTimestamp="2025-11-26 14:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:28.93726122 +0000 UTC m=+5217.616463038" watchObservedRunningTime="2025-11-26 14:57:29.93495071 +0000 UTC m=+5218.614152538" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.335203 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.342398 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.507033 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-operator-scripts\") pod \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.507098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9x4f\" (UniqueName: \"kubernetes.io/projected/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-kube-api-access-f9x4f\") pod \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\" (UID: \"9d9bcfe0-f199-4c6c-8317-6beab93a44a4\") " Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.507209 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nff\" (UniqueName: \"kubernetes.io/projected/29d6650b-66a6-4341-b458-ada349675a6a-kube-api-access-48nff\") pod \"29d6650b-66a6-4341-b458-ada349675a6a\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.507406 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d6650b-66a6-4341-b458-ada349675a6a-operator-scripts\") pod \"29d6650b-66a6-4341-b458-ada349675a6a\" (UID: \"29d6650b-66a6-4341-b458-ada349675a6a\") " Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.508192 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d9bcfe0-f199-4c6c-8317-6beab93a44a4" (UID: "9d9bcfe0-f199-4c6c-8317-6beab93a44a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.508332 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d6650b-66a6-4341-b458-ada349675a6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29d6650b-66a6-4341-b458-ada349675a6a" (UID: "29d6650b-66a6-4341-b458-ada349675a6a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.508571 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d6650b-66a6-4341-b458-ada349675a6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.508591 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.514187 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-kube-api-access-f9x4f" (OuterVolumeSpecName: "kube-api-access-f9x4f") pod "9d9bcfe0-f199-4c6c-8317-6beab93a44a4" (UID: "9d9bcfe0-f199-4c6c-8317-6beab93a44a4"). InnerVolumeSpecName "kube-api-access-f9x4f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.516124 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d6650b-66a6-4341-b458-ada349675a6a-kube-api-access-48nff" (OuterVolumeSpecName: "kube-api-access-48nff") pod "29d6650b-66a6-4341-b458-ada349675a6a" (UID: "29d6650b-66a6-4341-b458-ada349675a6a"). InnerVolumeSpecName "kube-api-access-48nff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.610232 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f9x4f\" (UniqueName: \"kubernetes.io/projected/9d9bcfe0-f199-4c6c-8317-6beab93a44a4-kube-api-access-f9x4f\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.610270 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-48nff\" (UniqueName: \"kubernetes.io/projected/29d6650b-66a6-4341-b458-ada349675a6a-kube-api-access-48nff\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.950365 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cwzk4" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.950489 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cwzk4" event={"ID":"29d6650b-66a6-4341-b458-ada349675a6a","Type":"ContainerDied","Data":"98ba83d9ae3c2937089fda4e2765157d714082a7dacf08bcb2187dcd910bad04"} Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.950537 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ba83d9ae3c2937089fda4e2765157d714082a7dacf08bcb2187dcd910bad04" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.956566 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a6e0-account-create-update-976w8" event={"ID":"9d9bcfe0-f199-4c6c-8317-6beab93a44a4","Type":"ContainerDied","Data":"afa19839515da28f77f6921a5c4647f3c278cf40c8d6a232ff63d1cdbdee17b2"} Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.956613 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa19839515da28f77f6921a5c4647f3c278cf40c8d6a232ff63d1cdbdee17b2" Nov 26 14:57:31 crc kubenswrapper[5200]: I1126 14:57:31.956676 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a6e0-account-create-update-976w8" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.194872 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qctzs"] Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.199438 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d9bcfe0-f199-4c6c-8317-6beab93a44a4" containerName="mariadb-account-create-update" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.199485 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9bcfe0-f199-4c6c-8317-6beab93a44a4" containerName="mariadb-account-create-update" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.199546 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29d6650b-66a6-4341-b458-ada349675a6a" containerName="mariadb-database-create" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.199555 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d6650b-66a6-4341-b458-ada349675a6a" containerName="mariadb-database-create" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.199779 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d9bcfe0-f199-4c6c-8317-6beab93a44a4" containerName="mariadb-account-create-update" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.199796 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="29d6650b-66a6-4341-b458-ada349675a6a" containerName="mariadb-database-create" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.235079 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qctzs"] Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.235202 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.237742 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.237937 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.239740 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-ktxkv\"" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.241947 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.340832 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bkxt\" (UniqueName: \"kubernetes.io/projected/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-kube-api-access-8bkxt\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.340948 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-config-data\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.341470 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-combined-ca-bundle\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.443418 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-combined-ca-bundle\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.443521 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bkxt\" (UniqueName: \"kubernetes.io/projected/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-kube-api-access-8bkxt\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.443599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-config-data\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.449223 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-config-data\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.449476 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-combined-ca-bundle\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.466934 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bkxt\" (UniqueName: \"kubernetes.io/projected/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-kube-api-access-8bkxt\") pod \"keystone-db-sync-qctzs\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:33 crc kubenswrapper[5200]: I1126 14:57:33.561370 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:34 crc kubenswrapper[5200]: I1126 14:57:34.108433 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qctzs"] Nov 26 14:57:34 crc kubenswrapper[5200]: W1126 14:57:34.123773 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3df158a9_9d0e_486f_a9e0_d9fe18c5993b.slice/crio-7bddc58c291d840067cc849be7d76ea0b3080de12c133d40c3acaf00a1d452f6 WatchSource:0}: Error finding container 7bddc58c291d840067cc849be7d76ea0b3080de12c133d40c3acaf00a1d452f6: Status 404 returned error can't find the container with id 7bddc58c291d840067cc849be7d76ea0b3080de12c133d40c3acaf00a1d452f6 Nov 26 14:57:34 crc kubenswrapper[5200]: I1126 14:57:34.969264 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:57:34 crc kubenswrapper[5200]: E1126 14:57:34.969546 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:57:34 crc kubenswrapper[5200]: I1126 14:57:34.985482 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qctzs" event={"ID":"3df158a9-9d0e-486f-a9e0-d9fe18c5993b","Type":"ContainerStarted","Data":"deeb37e98121744b207b8aaa4e6b9e9fc0f058894aa41541f3f261a7821aea76"} Nov 26 14:57:34 crc kubenswrapper[5200]: I1126 14:57:34.985527 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qctzs" event={"ID":"3df158a9-9d0e-486f-a9e0-d9fe18c5993b","Type":"ContainerStarted","Data":"7bddc58c291d840067cc849be7d76ea0b3080de12c133d40c3acaf00a1d452f6"} Nov 26 14:57:35 crc kubenswrapper[5200]: I1126 14:57:35.021891 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qctzs" podStartSLOduration=2.021872212 podStartE2EDuration="2.021872212s" podCreationTimestamp="2025-11-26 14:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:35.01382539 +0000 UTC m=+5223.693027208" watchObservedRunningTime="2025-11-26 14:57:35.021872212 +0000 UTC m=+5223.701074030" Nov 26 14:57:36 crc kubenswrapper[5200]: I1126 14:57:36.955610 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovn-northd-0" Nov 26 14:57:37 crc kubenswrapper[5200]: I1126 14:57:37.016663 5200 generic.go:358] "Generic (PLEG): container finished" podID="3df158a9-9d0e-486f-a9e0-d9fe18c5993b" containerID="deeb37e98121744b207b8aaa4e6b9e9fc0f058894aa41541f3f261a7821aea76" exitCode=0 Nov 26 14:57:37 crc kubenswrapper[5200]: I1126 14:57:37.016804 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qctzs" event={"ID":"3df158a9-9d0e-486f-a9e0-d9fe18c5993b","Type":"ContainerDied","Data":"deeb37e98121744b207b8aaa4e6b9e9fc0f058894aa41541f3f261a7821aea76"} Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.419121 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.533453 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-config-data\") pod \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.533542 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bkxt\" (UniqueName: \"kubernetes.io/projected/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-kube-api-access-8bkxt\") pod \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.533570 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-combined-ca-bundle\") pod \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\" (UID: \"3df158a9-9d0e-486f-a9e0-d9fe18c5993b\") " Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.573527 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-kube-api-access-8bkxt" (OuterVolumeSpecName: "kube-api-access-8bkxt") pod "3df158a9-9d0e-486f-a9e0-d9fe18c5993b" (UID: "3df158a9-9d0e-486f-a9e0-d9fe18c5993b"). InnerVolumeSpecName "kube-api-access-8bkxt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.590411 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3df158a9-9d0e-486f-a9e0-d9fe18c5993b" (UID: "3df158a9-9d0e-486f-a9e0-d9fe18c5993b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.613040 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-config-data" (OuterVolumeSpecName: "config-data") pod "3df158a9-9d0e-486f-a9e0-d9fe18c5993b" (UID: "3df158a9-9d0e-486f-a9e0-d9fe18c5993b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.635497 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.635586 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8bkxt\" (UniqueName: \"kubernetes.io/projected/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-kube-api-access-8bkxt\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:38 crc kubenswrapper[5200]: I1126 14:57:38.635602 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df158a9-9d0e-486f-a9e0-d9fe18c5993b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.038775 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qctzs" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.038749 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qctzs" event={"ID":"3df158a9-9d0e-486f-a9e0-d9fe18c5993b","Type":"ContainerDied","Data":"7bddc58c291d840067cc849be7d76ea0b3080de12c133d40c3acaf00a1d452f6"} Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.038931 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bddc58c291d840067cc849be7d76ea0b3080de12c133d40c3acaf00a1d452f6" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.311528 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d4bf4bdcc-4k76c"] Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.312859 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3df158a9-9d0e-486f-a9e0-d9fe18c5993b" containerName="keystone-db-sync" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.312882 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df158a9-9d0e-486f-a9e0-d9fe18c5993b" containerName="keystone-db-sync" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.313076 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3df158a9-9d0e-486f-a9e0-d9fe18c5993b" containerName="keystone-db-sync" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.324453 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d4bf4bdcc-4k76c"] Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.325335 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.327880 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xllfc"] Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.348635 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbf5n\" (UniqueName: \"kubernetes.io/projected/fbd62451-f99b-4030-a542-d749bbff084d-kube-api-access-hbf5n\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.348753 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-dns-svc\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.348834 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-config\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.349007 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-nb\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.349076 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-sb\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.369986 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xllfc"] Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.370144 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.374129 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.374333 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.374493 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-ktxkv\"" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.374619 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.376764 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450710 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwz2\" (UniqueName: \"kubernetes.io/projected/007c0775-f489-40ca-a700-e8ed03a4365a-kube-api-access-5jwz2\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450827 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-config\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450875 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-combined-ca-bundle\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450911 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-scripts\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450936 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-credential-keys\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450971 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-nb\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.450996 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-config-data\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.451040 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-sb\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.451113 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbf5n\" (UniqueName: \"kubernetes.io/projected/fbd62451-f99b-4030-a542-d749bbff084d-kube-api-access-hbf5n\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.451150 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-fernet-keys\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.451175 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-dns-svc\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.451763 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-config\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.452066 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-dns-svc\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.452549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-nb\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.452669 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-sb\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.470082 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbf5n\" (UniqueName: \"kubernetes.io/projected/fbd62451-f99b-4030-a542-d749bbff084d-kube-api-access-hbf5n\") pod \"dnsmasq-dns-d4bf4bdcc-4k76c\" (UID: 
\"fbd62451-f99b-4030-a542-d749bbff084d\") " pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.552165 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-fernet-keys\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.552532 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwz2\" (UniqueName: \"kubernetes.io/projected/007c0775-f489-40ca-a700-e8ed03a4365a-kube-api-access-5jwz2\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.552658 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-combined-ca-bundle\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.552789 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-scripts\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.552872 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-credential-keys\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.552965 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-config-data\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.556988 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-fernet-keys\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.557239 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-config-data\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.557341 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-credential-keys\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.564670 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-scripts\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.573049 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-combined-ca-bundle\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.574810 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwz2\" (UniqueName: \"kubernetes.io/projected/007c0775-f489-40ca-a700-e8ed03a4365a-kube-api-access-5jwz2\") pod \"keystone-bootstrap-xllfc\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.650710 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:39 crc kubenswrapper[5200]: I1126 14:57:39.692493 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:40 crc kubenswrapper[5200]: I1126 14:57:40.099896 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d4bf4bdcc-4k76c"] Nov 26 14:57:40 crc kubenswrapper[5200]: W1126 14:57:40.101913 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbd62451_f99b_4030_a542_d749bbff084d.slice/crio-a2cada01148cbc68555a5c8de04c1afb78903fc60e738d0f1059e52b394a2cc7 WatchSource:0}: Error finding container a2cada01148cbc68555a5c8de04c1afb78903fc60e738d0f1059e52b394a2cc7: Status 404 returned error can't find the container with id a2cada01148cbc68555a5c8de04c1afb78903fc60e738d0f1059e52b394a2cc7 Nov 26 14:57:40 crc kubenswrapper[5200]: I1126 14:57:40.175936 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xllfc"] Nov 26 14:57:40 crc kubenswrapper[5200]: W1126 14:57:40.180223 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007c0775_f489_40ca_a700_e8ed03a4365a.slice/crio-c1b7d54f814e54a1315d3292596ac49583496093eb0a8cc3f07c437743c38d51 WatchSource:0}: Error finding container c1b7d54f814e54a1315d3292596ac49583496093eb0a8cc3f07c437743c38d51: Status 404 returned error can't find the container with id c1b7d54f814e54a1315d3292596ac49583496093eb0a8cc3f07c437743c38d51 Nov 26 14:57:41 crc kubenswrapper[5200]: I1126 14:57:41.058045 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xllfc" event={"ID":"007c0775-f489-40ca-a700-e8ed03a4365a","Type":"ContainerStarted","Data":"74d6c21bb57ef5eaf5ce9e86ea093a5b52f956d560c01f74162995431febd430"} Nov 26 14:57:41 crc kubenswrapper[5200]: I1126 14:57:41.058493 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xllfc" event={"ID":"007c0775-f489-40ca-a700-e8ed03a4365a","Type":"ContainerStarted","Data":"c1b7d54f814e54a1315d3292596ac49583496093eb0a8cc3f07c437743c38d51"} Nov 26 14:57:41 crc kubenswrapper[5200]: I1126 14:57:41.059590 5200 generic.go:358] "Generic (PLEG): container finished" 
podID="fbd62451-f99b-4030-a542-d749bbff084d" containerID="00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a" exitCode=0 Nov 26 14:57:41 crc kubenswrapper[5200]: I1126 14:57:41.059859 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" event={"ID":"fbd62451-f99b-4030-a542-d749bbff084d","Type":"ContainerDied","Data":"00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a"} Nov 26 14:57:41 crc kubenswrapper[5200]: I1126 14:57:41.059886 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" event={"ID":"fbd62451-f99b-4030-a542-d749bbff084d","Type":"ContainerStarted","Data":"a2cada01148cbc68555a5c8de04c1afb78903fc60e738d0f1059e52b394a2cc7"} Nov 26 14:57:41 crc kubenswrapper[5200]: I1126 14:57:41.081883 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xllfc" podStartSLOduration=2.081824002 podStartE2EDuration="2.081824002s" podCreationTimestamp="2025-11-26 14:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:41.07944471 +0000 UTC m=+5229.758646528" watchObservedRunningTime="2025-11-26 14:57:41.081824002 +0000 UTC m=+5229.761025830" Nov 26 14:57:42 crc kubenswrapper[5200]: I1126 14:57:42.077446 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" event={"ID":"fbd62451-f99b-4030-a542-d749bbff084d","Type":"ContainerStarted","Data":"32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d"} Nov 26 14:57:42 crc kubenswrapper[5200]: I1126 14:57:42.077925 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:42 crc kubenswrapper[5200]: I1126 14:57:42.114775 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" podStartSLOduration=3.114759763 podStartE2EDuration="3.114759763s" podCreationTimestamp="2025-11-26 14:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:42.112128554 +0000 UTC m=+5230.791330372" watchObservedRunningTime="2025-11-26 14:57:42.114759763 +0000 UTC m=+5230.793961581" Nov 26 14:57:42 crc kubenswrapper[5200]: E1126 14:57:42.209478 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1487496 actualBytes=10240 Nov 26 14:57:44 crc kubenswrapper[5200]: I1126 14:57:44.095838 5200 generic.go:358] "Generic (PLEG): container finished" podID="007c0775-f489-40ca-a700-e8ed03a4365a" containerID="74d6c21bb57ef5eaf5ce9e86ea093a5b52f956d560c01f74162995431febd430" exitCode=0 Nov 26 14:57:44 crc kubenswrapper[5200]: I1126 14:57:44.095933 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xllfc" event={"ID":"007c0775-f489-40ca-a700-e8ed03a4365a","Type":"ContainerDied","Data":"74d6c21bb57ef5eaf5ce9e86ea093a5b52f956d560c01f74162995431febd430"} Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.473044 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.597518 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-fernet-keys\") pod \"007c0775-f489-40ca-a700-e8ed03a4365a\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.597563 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-credential-keys\") pod \"007c0775-f489-40ca-a700-e8ed03a4365a\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.597599 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-combined-ca-bundle\") pod \"007c0775-f489-40ca-a700-e8ed03a4365a\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.597795 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-scripts\") pod \"007c0775-f489-40ca-a700-e8ed03a4365a\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.597831 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-config-data\") pod \"007c0775-f489-40ca-a700-e8ed03a4365a\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.597882 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jwz2\" (UniqueName: \"kubernetes.io/projected/007c0775-f489-40ca-a700-e8ed03a4365a-kube-api-access-5jwz2\") pod \"007c0775-f489-40ca-a700-e8ed03a4365a\" (UID: \"007c0775-f489-40ca-a700-e8ed03a4365a\") " Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.604031 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007c0775-f489-40ca-a700-e8ed03a4365a-kube-api-access-5jwz2" (OuterVolumeSpecName: "kube-api-access-5jwz2") pod "007c0775-f489-40ca-a700-e8ed03a4365a" (UID: "007c0775-f489-40ca-a700-e8ed03a4365a"). InnerVolumeSpecName "kube-api-access-5jwz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.609873 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-scripts" (OuterVolumeSpecName: "scripts") pod "007c0775-f489-40ca-a700-e8ed03a4365a" (UID: "007c0775-f489-40ca-a700-e8ed03a4365a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.609942 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "007c0775-f489-40ca-a700-e8ed03a4365a" (UID: "007c0775-f489-40ca-a700-e8ed03a4365a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.610439 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "007c0775-f489-40ca-a700-e8ed03a4365a" (UID: "007c0775-f489-40ca-a700-e8ed03a4365a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.627094 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "007c0775-f489-40ca-a700-e8ed03a4365a" (UID: "007c0775-f489-40ca-a700-e8ed03a4365a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.635903 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-config-data" (OuterVolumeSpecName: "config-data") pod "007c0775-f489-40ca-a700-e8ed03a4365a" (UID: "007c0775-f489-40ca-a700-e8ed03a4365a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.705878 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.705915 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.705930 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jwz2\" (UniqueName: \"kubernetes.io/projected/007c0775-f489-40ca-a700-e8ed03a4365a-kube-api-access-5jwz2\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.705943 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.705955 5200 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:45 crc kubenswrapper[5200]: I1126 14:57:45.705967 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007c0775-f489-40ca-a700-e8ed03a4365a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.116552 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xllfc" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.116597 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xllfc" event={"ID":"007c0775-f489-40ca-a700-e8ed03a4365a","Type":"ContainerDied","Data":"c1b7d54f814e54a1315d3292596ac49583496093eb0a8cc3f07c437743c38d51"} Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.116661 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1b7d54f814e54a1315d3292596ac49583496093eb0a8cc3f07c437743c38d51" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.286592 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xllfc"] Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.294533 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xllfc"] Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.383619 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xmnrq"] Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.385239 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="007c0775-f489-40ca-a700-e8ed03a4365a" containerName="keystone-bootstrap" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.385277 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c0775-f489-40ca-a700-e8ed03a4365a" containerName="keystone-bootstrap" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.385477 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="007c0775-f489-40ca-a700-e8ed03a4365a" containerName="keystone-bootstrap" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.393730 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.393735 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xmnrq"] Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.396235 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.396865 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.396898 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.397074 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-ktxkv\"" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.397150 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.519828 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gtq8\" (UniqueName: \"kubernetes.io/projected/f8e3da23-c789-4052-ad28-f50b8b42cc52-kube-api-access-4gtq8\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.519906 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-fernet-keys\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.519961 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-config-data\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.520130 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-combined-ca-bundle\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.520183 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-scripts\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.520291 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-credential-keys\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 
14:57:46.622843 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-credential-keys\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.623039 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gtq8\" (UniqueName: \"kubernetes.io/projected/f8e3da23-c789-4052-ad28-f50b8b42cc52-kube-api-access-4gtq8\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.623099 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-fernet-keys\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.623178 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-config-data\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.623238 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-combined-ca-bundle\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.623303 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-scripts\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.627656 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-credential-keys\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.627705 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-scripts\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.628482 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-config-data\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.628990 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-fernet-keys\") 
pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.637330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-combined-ca-bundle\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.640549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gtq8\" (UniqueName: \"kubernetes.io/projected/f8e3da23-c789-4052-ad28-f50b8b42cc52-kube-api-access-4gtq8\") pod \"keystone-bootstrap-xmnrq\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.712463 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:46 crc kubenswrapper[5200]: I1126 14:57:46.982224 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007c0775-f489-40ca-a700-e8ed03a4365a" path="/var/lib/kubelet/pods/007c0775-f489-40ca-a700-e8ed03a4365a/volumes" Nov 26 14:57:47 crc kubenswrapper[5200]: W1126 14:57:47.183818 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e3da23_c789_4052_ad28_f50b8b42cc52.slice/crio-d8e8050d9ed12d5364175dff19972060e21229e7ff67a4179fd5d203282e0651 WatchSource:0}: Error finding container d8e8050d9ed12d5364175dff19972060e21229e7ff67a4179fd5d203282e0651: Status 404 returned error can't find the container with id d8e8050d9ed12d5364175dff19972060e21229e7ff67a4179fd5d203282e0651 Nov 26 14:57:47 crc kubenswrapper[5200]: I1126 14:57:47.191092 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xmnrq"] Nov 26 14:57:47 crc kubenswrapper[5200]: I1126 14:57:47.969807 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:57:47 crc kubenswrapper[5200]: E1126 14:57:47.970544 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:57:48 crc kubenswrapper[5200]: I1126 14:57:48.135079 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xmnrq" event={"ID":"f8e3da23-c789-4052-ad28-f50b8b42cc52","Type":"ContainerStarted","Data":"1106ff9622ccfd2e054ba2412a00d94be6f0fbda0acbb7c41719b4059c523b51"} Nov 26 14:57:48 crc kubenswrapper[5200]: I1126 14:57:48.135144 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xmnrq" event={"ID":"f8e3da23-c789-4052-ad28-f50b8b42cc52","Type":"ContainerStarted","Data":"d8e8050d9ed12d5364175dff19972060e21229e7ff67a4179fd5d203282e0651"} Nov 26 14:57:48 crc kubenswrapper[5200]: I1126 14:57:48.154631 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xmnrq" podStartSLOduration=2.154617297 
podStartE2EDuration="2.154617297s" podCreationTimestamp="2025-11-26 14:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:48.151417622 +0000 UTC m=+5236.830619510" watchObservedRunningTime="2025-11-26 14:57:48.154617297 +0000 UTC m=+5236.833819115" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.098837 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.152953 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-655b9d9769-v5czh"] Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.153538 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerName="dnsmasq-dns" containerID="cri-o://1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e" gracePeriod=10 Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.638405 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.677549 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-dns-svc\") pod \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.677873 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxfbl\" (UniqueName: \"kubernetes.io/projected/46561b5f-ed2b-4ac6-924b-9feb39192c8a-kube-api-access-gxfbl\") pod \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.678093 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-config\") pod \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.678252 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-nb\") pod \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.679417 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-sb\") pod \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\" (UID: \"46561b5f-ed2b-4ac6-924b-9feb39192c8a\") " Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.684935 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46561b5f-ed2b-4ac6-924b-9feb39192c8a-kube-api-access-gxfbl" (OuterVolumeSpecName: "kube-api-access-gxfbl") pod "46561b5f-ed2b-4ac6-924b-9feb39192c8a" (UID: "46561b5f-ed2b-4ac6-924b-9feb39192c8a"). InnerVolumeSpecName "kube-api-access-gxfbl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.737874 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-config" (OuterVolumeSpecName: "config") pod "46561b5f-ed2b-4ac6-924b-9feb39192c8a" (UID: "46561b5f-ed2b-4ac6-924b-9feb39192c8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.738790 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46561b5f-ed2b-4ac6-924b-9feb39192c8a" (UID: "46561b5f-ed2b-4ac6-924b-9feb39192c8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.746819 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46561b5f-ed2b-4ac6-924b-9feb39192c8a" (UID: "46561b5f-ed2b-4ac6-924b-9feb39192c8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.761666 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46561b5f-ed2b-4ac6-924b-9feb39192c8a" (UID: "46561b5f-ed2b-4ac6-924b-9feb39192c8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.782102 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxfbl\" (UniqueName: \"kubernetes.io/projected/46561b5f-ed2b-4ac6-924b-9feb39192c8a-kube-api-access-gxfbl\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.782470 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.782551 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.782617 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:49 crc kubenswrapper[5200]: I1126 14:57:49.782712 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46561b5f-ed2b-4ac6-924b-9feb39192c8a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.196322 5200 generic.go:358] "Generic (PLEG): container finished" podID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerID="1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e" exitCode=0 Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.197333 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.198948 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" event={"ID":"46561b5f-ed2b-4ac6-924b-9feb39192c8a","Type":"ContainerDied","Data":"1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e"} Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.199143 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655b9d9769-v5czh" event={"ID":"46561b5f-ed2b-4ac6-924b-9feb39192c8a","Type":"ContainerDied","Data":"98b8c0383c62bd481346d9548d3343e07b605beaea844679fea3a531ab79865f"} Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.199223 5200 scope.go:117] "RemoveContainer" containerID="1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.205513 5200 generic.go:358] "Generic (PLEG): container finished" podID="f8e3da23-c789-4052-ad28-f50b8b42cc52" containerID="1106ff9622ccfd2e054ba2412a00d94be6f0fbda0acbb7c41719b4059c523b51" exitCode=0 Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.206036 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xmnrq" event={"ID":"f8e3da23-c789-4052-ad28-f50b8b42cc52","Type":"ContainerDied","Data":"1106ff9622ccfd2e054ba2412a00d94be6f0fbda0acbb7c41719b4059c523b51"} Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.233277 5200 scope.go:117] "RemoveContainer" containerID="89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.256566 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-655b9d9769-v5czh"] Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.262748 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-655b9d9769-v5czh"] Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.264759 5200 scope.go:117] "RemoveContainer" containerID="1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e" Nov 26 14:57:50 crc kubenswrapper[5200]: E1126 14:57:50.265255 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e\": container with ID starting with 1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e not found: ID does not exist" containerID="1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.265307 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e"} err="failed to get container status \"1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e\": rpc error: code = NotFound desc = could not find container \"1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e\": container with ID starting with 1716f7eed9c178e2cc3ca1147e8240c32e2e03395464fe90de0d02c827b3be5e not found: ID does not exist" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.265341 5200 scope.go:117] "RemoveContainer" containerID="89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d" Nov 26 14:57:50 crc kubenswrapper[5200]: E1126 14:57:50.265758 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d\": container with ID starting with 89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d not found: ID does not exist" containerID="89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.265788 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d"} err="failed to get container status \"89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d\": rpc error: code = NotFound desc = could not find container \"89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d\": container with ID starting with 89aacee8c886ac36a23b1e63139a5fa1f91b62ad1f6ab079f537f3b336d3c27d not found: ID does not exist" Nov 26 14:57:50 crc kubenswrapper[5200]: I1126 14:57:50.981908 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" path="/var/lib/kubelet/pods/46561b5f-ed2b-4ac6-924b-9feb39192c8a/volumes" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.620286 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.717355 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-scripts\") pod \"f8e3da23-c789-4052-ad28-f50b8b42cc52\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.717422 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-fernet-keys\") pod \"f8e3da23-c789-4052-ad28-f50b8b42cc52\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.717495 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-config-data\") pod \"f8e3da23-c789-4052-ad28-f50b8b42cc52\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.717521 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-credential-keys\") pod \"f8e3da23-c789-4052-ad28-f50b8b42cc52\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.717569 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-combined-ca-bundle\") pod \"f8e3da23-c789-4052-ad28-f50b8b42cc52\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.717723 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gtq8\" (UniqueName: \"kubernetes.io/projected/f8e3da23-c789-4052-ad28-f50b8b42cc52-kube-api-access-4gtq8\") pod \"f8e3da23-c789-4052-ad28-f50b8b42cc52\" (UID: \"f8e3da23-c789-4052-ad28-f50b8b42cc52\") " Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.724727 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f8e3da23-c789-4052-ad28-f50b8b42cc52" (UID: "f8e3da23-c789-4052-ad28-f50b8b42cc52"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.724759 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f8e3da23-c789-4052-ad28-f50b8b42cc52" (UID: "f8e3da23-c789-4052-ad28-f50b8b42cc52"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.725374 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e3da23-c789-4052-ad28-f50b8b42cc52-kube-api-access-4gtq8" (OuterVolumeSpecName: "kube-api-access-4gtq8") pod "f8e3da23-c789-4052-ad28-f50b8b42cc52" (UID: "f8e3da23-c789-4052-ad28-f50b8b42cc52"). InnerVolumeSpecName "kube-api-access-4gtq8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.726699 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-scripts" (OuterVolumeSpecName: "scripts") pod "f8e3da23-c789-4052-ad28-f50b8b42cc52" (UID: "f8e3da23-c789-4052-ad28-f50b8b42cc52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.742114 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-config-data" (OuterVolumeSpecName: "config-data") pod "f8e3da23-c789-4052-ad28-f50b8b42cc52" (UID: "f8e3da23-c789-4052-ad28-f50b8b42cc52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.743515 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e3da23-c789-4052-ad28-f50b8b42cc52" (UID: "f8e3da23-c789-4052-ad28-f50b8b42cc52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.819752 5200 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.819800 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.819813 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gtq8\" (UniqueName: \"kubernetes.io/projected/f8e3da23-c789-4052-ad28-f50b8b42cc52-kube-api-access-4gtq8\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.819827 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.819841 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:51 crc kubenswrapper[5200]: I1126 14:57:51.819852 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3da23-c789-4052-ad28-f50b8b42cc52-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.237404 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xmnrq" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.237881 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xmnrq" event={"ID":"f8e3da23-c789-4052-ad28-f50b8b42cc52","Type":"ContainerDied","Data":"d8e8050d9ed12d5364175dff19972060e21229e7ff67a4179fd5d203282e0651"} Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.237919 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8e8050d9ed12d5364175dff19972060e21229e7ff67a4179fd5d203282e0651" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.301489 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f7d99cff4-gn8zk"] Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303167 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e3da23-c789-4052-ad28-f50b8b42cc52" containerName="keystone-bootstrap" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303196 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3da23-c789-4052-ad28-f50b8b42cc52" containerName="keystone-bootstrap" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303235 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerName="dnsmasq-dns" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303244 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerName="dnsmasq-dns" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303293 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerName="init" Nov 26 
14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303300 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerName="init" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303486 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e3da23-c789-4052-ad28-f50b8b42cc52" containerName="keystone-bootstrap" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.303505 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="46561b5f-ed2b-4ac6-924b-9feb39192c8a" containerName="dnsmasq-dns" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.315896 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f7d99cff4-gn8zk"] Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.316047 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.318201 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.318522 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.319506 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-ktxkv\"" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.320145 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.365913 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-scripts\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.366002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-config-data\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.366057 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-fernet-keys\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.366163 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-combined-ca-bundle\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.366181 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75h99\" (UniqueName: \"kubernetes.io/projected/146a6403-b94d-48dd-8682-827c7dcf966d-kube-api-access-75h99\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: 
\"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.366219 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-credential-keys\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.467156 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-config-data\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.467225 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-fernet-keys\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.467296 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-combined-ca-bundle\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.467316 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75h99\" (UniqueName: \"kubernetes.io/projected/146a6403-b94d-48dd-8682-827c7dcf966d-kube-api-access-75h99\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.467347 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-credential-keys\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.467383 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-scripts\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.473319 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-scripts\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.473490 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-config-data\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.473932 
5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-fernet-keys\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.474797 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-combined-ca-bundle\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.479859 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/146a6403-b94d-48dd-8682-827c7dcf966d-credential-keys\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.486643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75h99\" (UniqueName: \"kubernetes.io/projected/146a6403-b94d-48dd-8682-827c7dcf966d-kube-api-access-75h99\") pod \"keystone-6f7d99cff4-gn8zk\" (UID: \"146a6403-b94d-48dd-8682-827c7dcf966d\") " pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:52 crc kubenswrapper[5200]: I1126 14:57:52.684996 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:53 crc kubenswrapper[5200]: I1126 14:57:53.142887 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f7d99cff4-gn8zk"] Nov 26 14:57:53 crc kubenswrapper[5200]: I1126 14:57:53.247883 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f7d99cff4-gn8zk" event={"ID":"146a6403-b94d-48dd-8682-827c7dcf966d","Type":"ContainerStarted","Data":"7a566818c431fff122333b11049f5472959982100b5e3ab355c288bf761cb578"} Nov 26 14:57:54 crc kubenswrapper[5200]: I1126 14:57:54.258418 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f7d99cff4-gn8zk" event={"ID":"146a6403-b94d-48dd-8682-827c7dcf966d","Type":"ContainerStarted","Data":"a5c069a29a7cd7b04c7c1bb52ec5d68f6577f794cb8b2c2e03b4d9f14e595de0"} Nov 26 14:57:54 crc kubenswrapper[5200]: I1126 14:57:54.258959 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:57:54 crc kubenswrapper[5200]: I1126 14:57:54.281307 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f7d99cff4-gn8zk" podStartSLOduration=2.281287925 podStartE2EDuration="2.281287925s" podCreationTimestamp="2025-11-26 14:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:57:54.280820503 +0000 UTC m=+5242.960022341" watchObservedRunningTime="2025-11-26 14:57:54.281287925 +0000 UTC m=+5242.960489743" Nov 26 14:57:59 crc kubenswrapper[5200]: I1126 14:57:59.969062 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:57:59 crc kubenswrapper[5200]: E1126 14:57:59.970133 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:58:13 crc kubenswrapper[5200]: I1126 14:58:13.969751 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:58:13 crc kubenswrapper[5200]: E1126 14:58:13.970830 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:58:24 crc kubenswrapper[5200]: I1126 14:58:24.969414 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:58:24 crc kubenswrapper[5200]: E1126 14:58:24.970521 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:58:26 crc kubenswrapper[5200]: I1126 14:58:26.606999 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f7d99cff4-gn8zk" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.590324 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.646391 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.646739 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.649487 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-config\"" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.650451 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-config-secret\"" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.650980 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstackclient-openstackclient-dockercfg-m5kbp\"" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.710456 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.710680 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgl5\" (UniqueName: \"kubernetes.io/projected/07a4d587-f5d3-4368-8100-5c07987395e7-kube-api-access-nqgl5\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.711061 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.721614 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:27 crc kubenswrapper[5200]: E1126 14:58:27.722580 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nqgl5 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="07a4d587-f5d3-4368-8100-5c07987395e7" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.733300 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.740728 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.768443 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.768581 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.771112 5200 scope.go:117] "RemoveContainer" containerID="202e7d0816f03fc4fa5bec3180c5e92399ed830edb9dd083883eaa360bd0fd01" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.812935 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.813281 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqgl5\" (UniqueName: \"kubernetes.io/projected/07a4d587-f5d3-4368-8100-5c07987395e7-kube-api-access-nqgl5\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.813435 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.817504 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: E1126 14:58:27.817962 5200 projected.go:194] Error preparing data for projected volume kube-api-access-nqgl5 for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (07a4d587-f5d3-4368-8100-5c07987395e7) does not match the UID in record. The object might have been deleted and then recreated Nov 26 14:58:27 crc kubenswrapper[5200]: E1126 14:58:27.818056 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07a4d587-f5d3-4368-8100-5c07987395e7-kube-api-access-nqgl5 podName:07a4d587-f5d3-4368-8100-5c07987395e7 nodeName:}" failed. No retries permitted until 2025-11-26 14:58:28.318028798 +0000 UTC m=+5276.997230606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nqgl5" (UniqueName: "kubernetes.io/projected/07a4d587-f5d3-4368-8100-5c07987395e7-kube-api-access-nqgl5") pod "openstackclient" (UID: "07a4d587-f5d3-4368-8100-5c07987395e7") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (07a4d587-f5d3-4368-8100-5c07987395e7) does not match the UID in record. 
The object might have been deleted and then recreated Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.823943 5200 scope.go:117] "RemoveContainer" containerID="a372b5a0dfcc36b71e3d7f4ccd0156e40d2ee131a8243ae15bf7ec47c103dd48" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.824286 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.913972 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.914755 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config-secret\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.914874 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.914934 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw4ft\" (UniqueName: \"kubernetes.io/projected/6f802e4d-ea2f-4754-ab33-a388f2b1efef-kube-api-access-hw4ft\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.919345 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="07a4d587-f5d3-4368-8100-5c07987395e7" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" Nov 26 14:58:27 crc kubenswrapper[5200]: I1126 14:58:27.940324 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.017068 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config-secret\") pod \"07a4d587-f5d3-4368-8100-5c07987395e7\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.017205 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config\") pod \"07a4d587-f5d3-4368-8100-5c07987395e7\" (UID: \"07a4d587-f5d3-4368-8100-5c07987395e7\") " Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.017458 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw4ft\" (UniqueName: \"kubernetes.io/projected/6f802e4d-ea2f-4754-ab33-a388f2b1efef-kube-api-access-hw4ft\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.017595 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config-secret\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.017735 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.017857 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqgl5\" (UniqueName: \"kubernetes.io/projected/07a4d587-f5d3-4368-8100-5c07987395e7-kube-api-access-nqgl5\") on node \"crc\" DevicePath \"\"" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.018910 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.019322 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "07a4d587-f5d3-4368-8100-5c07987395e7" (UID: "07a4d587-f5d3-4368-8100-5c07987395e7"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.024330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config-secret\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.025628 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "07a4d587-f5d3-4368-8100-5c07987395e7" (UID: "07a4d587-f5d3-4368-8100-5c07987395e7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.040457 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw4ft\" (UniqueName: \"kubernetes.io/projected/6f802e4d-ea2f-4754-ab33-a388f2b1efef-kube-api-access-hw4ft\") pod \"openstackclient\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.106649 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.124309 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.124352 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07a4d587-f5d3-4368-8100-5c07987395e7-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.418702 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.930106 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6f802e4d-ea2f-4754-ab33-a388f2b1efef","Type":"ContainerStarted","Data":"fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a"} Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.930670 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6f802e4d-ea2f-4754-ab33-a388f2b1efef","Type":"ContainerStarted","Data":"57d8119fb670ca69d685b3f6cf0dde97afe68a93d1c7c7f5f0906b1e46a8988c"} Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.930205 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.964062 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="07a4d587-f5d3-4368-8100-5c07987395e7" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.968831 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.968804725 podStartE2EDuration="1.968804725s" podCreationTimestamp="2025-11-26 14:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:58:28.959096819 +0000 UTC m=+5277.638298647" watchObservedRunningTime="2025-11-26 14:58:28.968804725 +0000 UTC m=+5277.648006573" Nov 26 14:58:28 crc kubenswrapper[5200]: I1126 14:58:28.993349 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a4d587-f5d3-4368-8100-5c07987395e7" path="/var/lib/kubelet/pods/07a4d587-f5d3-4368-8100-5c07987395e7/volumes" Nov 26 14:58:39 crc kubenswrapper[5200]: I1126 14:58:39.973317 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:58:39 crc kubenswrapper[5200]: E1126 14:58:39.975041 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:58:42 crc kubenswrapper[5200]: E1126 14:58:42.079654 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1487397 actualBytes=10240 Nov 26 14:58:53 crc kubenswrapper[5200]: I1126 14:58:53.969287 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:58:53 crc kubenswrapper[5200]: E1126 14:58:53.970210 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:59:08 crc kubenswrapper[5200]: I1126 14:59:08.969512 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:59:08 crc kubenswrapper[5200]: E1126 14:59:08.972331 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:59:21 crc kubenswrapper[5200]: I1126 14:59:21.970609 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:59:21 crc 
kubenswrapper[5200]: E1126 14:59:21.972332 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.605074 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.614388 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.618746 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.645569 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds62d\" (UniqueName: \"kubernetes.io/projected/bc17dc43-8f4d-4fce-af04-702d98df7231-kube-api-access-ds62d\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.645644 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-utilities\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.645789 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-catalog-content\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.747707 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds62d\" (UniqueName: \"kubernetes.io/projected/bc17dc43-8f4d-4fce-af04-702d98df7231-kube-api-access-ds62d\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.747790 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-utilities\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.747863 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-catalog-content\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.749131 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-utilities\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.749197 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-catalog-content\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.769934 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds62d\" (UniqueName: \"kubernetes.io/projected/bc17dc43-8f4d-4fce-af04-702d98df7231-kube-api-access-ds62d\") pod \"community-operators-f29zg\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:23 crc kubenswrapper[5200]: I1126 14:59:23.942298 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:24 crc kubenswrapper[5200]: I1126 14:59:24.506992 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 14:59:24 crc kubenswrapper[5200]: I1126 14:59:24.583409 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerStarted","Data":"7e4005ba40d6f834ccb09b0e0b712841e369fe2b633b5773a7f94d920d506fd5"} Nov 26 14:59:25 crc kubenswrapper[5200]: I1126 14:59:25.593513 5200 generic.go:358] "Generic (PLEG): container finished" podID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerID="90c949097da57d8a36f09563fb7602ed9432606ea5e919a471e029a24555c577" exitCode=0 Nov 26 14:59:25 crc kubenswrapper[5200]: I1126 14:59:25.593583 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerDied","Data":"90c949097da57d8a36f09563fb7602ed9432606ea5e919a471e029a24555c577"} Nov 26 14:59:29 crc kubenswrapper[5200]: I1126 14:59:29.633764 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerStarted","Data":"d6fba336bb4a3fcdb552d6d30950da5c8b9b1bbb78cadab24b3598b9080ddc95"} Nov 26 14:59:30 crc kubenswrapper[5200]: I1126 14:59:30.641525 5200 generic.go:358] "Generic (PLEG): container finished" podID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerID="d6fba336bb4a3fcdb552d6d30950da5c8b9b1bbb78cadab24b3598b9080ddc95" exitCode=0 Nov 26 14:59:30 crc kubenswrapper[5200]: I1126 14:59:30.641638 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerDied","Data":"d6fba336bb4a3fcdb552d6d30950da5c8b9b1bbb78cadab24b3598b9080ddc95"} Nov 26 14:59:31 crc kubenswrapper[5200]: I1126 14:59:31.652157 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" 
event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerStarted","Data":"271baf04f9b1184de2b53ac3db2448e1f9fe3d751ed9c3fdb379ea736aefe285"} Nov 26 14:59:31 crc kubenswrapper[5200]: I1126 14:59:31.676286 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f29zg" podStartSLOduration=4.857736689 podStartE2EDuration="8.676264078s" podCreationTimestamp="2025-11-26 14:59:23 +0000 UTC" firstStartedPulling="2025-11-26 14:59:25.595501262 +0000 UTC m=+5334.274703110" lastFinishedPulling="2025-11-26 14:59:29.414028661 +0000 UTC m=+5338.093230499" observedRunningTime="2025-11-26 14:59:31.670313051 +0000 UTC m=+5340.349514879" watchObservedRunningTime="2025-11-26 14:59:31.676264078 +0000 UTC m=+5340.355465906" Nov 26 14:59:33 crc kubenswrapper[5200]: I1126 14:59:33.943127 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:33 crc kubenswrapper[5200]: I1126 14:59:33.943997 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:33 crc kubenswrapper[5200]: I1126 14:59:33.970363 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:59:33 crc kubenswrapper[5200]: E1126 14:59:33.970635 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:59:34 crc kubenswrapper[5200]: I1126 14:59:34.006614 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:41 crc kubenswrapper[5200]: E1126 14:59:41.938319 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1495421 actualBytes=10240 Nov 26 14:59:44 crc kubenswrapper[5200]: I1126 14:59:44.733819 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f29zg" Nov 26 14:59:44 crc kubenswrapper[5200]: I1126 14:59:44.835873 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 14:59:44 crc kubenswrapper[5200]: I1126 14:59:44.881794 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxr6n"] Nov 26 14:59:44 crc kubenswrapper[5200]: I1126 14:59:44.882144 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nxr6n" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="registry-server" containerID="cri-o://6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d" gracePeriod=2 Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.397730 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.489334 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-catalog-content\") pod \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.489501 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-utilities\") pod \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.489579 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7z5l\" (UniqueName: \"kubernetes.io/projected/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-kube-api-access-c7z5l\") pod \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\" (UID: \"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c\") " Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.493934 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-utilities" (OuterVolumeSpecName: "utilities") pod "c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" (UID: "c1c896c5-a00d-4794-9c6b-0f4e7eefb79c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.507928 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-kube-api-access-c7z5l" (OuterVolumeSpecName: "kube-api-access-c7z5l") pod "c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" (UID: "c1c896c5-a00d-4794-9c6b-0f4e7eefb79c"). InnerVolumeSpecName "kube-api-access-c7z5l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.534422 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" (UID: "c1c896c5-a00d-4794-9c6b-0f4e7eefb79c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.591229 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.591261 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.591272 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7z5l\" (UniqueName: \"kubernetes.io/projected/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c-kube-api-access-c7z5l\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.778867 5200 generic.go:358] "Generic (PLEG): container finished" podID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerID="6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d" exitCode=0 Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.779229 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxr6n" event={"ID":"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c","Type":"ContainerDied","Data":"6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d"} Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.779259 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxr6n" event={"ID":"c1c896c5-a00d-4794-9c6b-0f4e7eefb79c","Type":"ContainerDied","Data":"87a61c420e19e4e9c5deb9794f84d61bc71ae4990d5af408c031a335819cfa92"} Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.779277 5200 scope.go:117] "RemoveContainer" containerID="6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.779429 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxr6n" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.810173 5200 scope.go:117] "RemoveContainer" containerID="fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.816784 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nxr6n"] Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.821739 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nxr6n"] Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.839166 5200 scope.go:117] "RemoveContainer" containerID="77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.863021 5200 scope.go:117] "RemoveContainer" containerID="6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d" Nov 26 14:59:45 crc kubenswrapper[5200]: E1126 14:59:45.863394 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d\": container with ID starting with 6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d not found: ID does not exist" containerID="6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.863424 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d"} err="failed to get container status \"6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d\": rpc error: code = NotFound desc = could not find container \"6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d\": container with ID starting with 6a1037a3dfb89e6e647cc131c6565e6d941181e8321dbdb9c3ea423c2cdc217d not found: ID does not exist" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.863445 5200 scope.go:117] "RemoveContainer" containerID="fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e" Nov 26 14:59:45 crc kubenswrapper[5200]: E1126 14:59:45.863842 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e\": container with ID starting with fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e not found: ID does not exist" containerID="fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.863860 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e"} err="failed to get container status \"fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e\": rpc error: code = NotFound desc = could not find container \"fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e\": container with ID starting with fb6b40edbb60656f061e004d697e10afb76c55713fe10fefea13f8805fd1fc7e not found: ID does not exist" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.863872 5200 scope.go:117] "RemoveContainer" containerID="77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea" Nov 26 14:59:45 crc kubenswrapper[5200]: E1126 14:59:45.864147 5200 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea\": container with ID starting with 77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea not found: ID does not exist" containerID="77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.864169 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea"} err="failed to get container status \"77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea\": rpc error: code = NotFound desc = could not find container \"77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea\": container with ID starting with 77f7ebc982be6ca082c5eb2cd364254e63ba7cf8dc4acbff288d437b1220b8ea not found: ID does not exist" Nov 26 14:59:45 crc kubenswrapper[5200]: I1126 14:59:45.971924 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:59:45 crc kubenswrapper[5200]: E1126 14:59:45.972596 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 14:59:46 crc kubenswrapper[5200]: I1126 14:59:46.984134 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" path="/var/lib/kubelet/pods/c1c896c5-a00d-4794-9c6b-0f4e7eefb79c/volumes" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.800225 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9mp8k"] Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806071 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="extract-utilities" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806145 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="extract-utilities" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806225 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="registry-server" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806244 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="registry-server" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806288 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="extract-content" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806303 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="extract-content" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.806783 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1c896c5-a00d-4794-9c6b-0f4e7eefb79c" containerName="registry-server" Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.833855 5200 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mp8k"] Nov 26 14:59:51 crc kubenswrapper[5200]: I1126 14:59:51.834096 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.006723 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-catalog-content\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.006780 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzkh\" (UniqueName: \"kubernetes.io/projected/ce738762-6aa9-4251-9772-53558aa9ba2b-kube-api-access-vpzkh\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.006836 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-utilities\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.108047 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-catalog-content\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.108121 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzkh\" (UniqueName: \"kubernetes.io/projected/ce738762-6aa9-4251-9772-53558aa9ba2b-kube-api-access-vpzkh\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.108154 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-utilities\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.108595 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-catalog-content\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.108621 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-utilities\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.134167 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzkh\" (UniqueName: \"kubernetes.io/projected/ce738762-6aa9-4251-9772-53558aa9ba2b-kube-api-access-vpzkh\") pod \"redhat-marketplace-9mp8k\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.157319 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.587156 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mp8k"] Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.593940 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.844139 5200 generic.go:358] "Generic (PLEG): container finished" podID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerID="17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a" exitCode=0 Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.844201 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mp8k" event={"ID":"ce738762-6aa9-4251-9772-53558aa9ba2b","Type":"ContainerDied","Data":"17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a"} Nov 26 14:59:52 crc kubenswrapper[5200]: I1126 14:59:52.844562 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mp8k" event={"ID":"ce738762-6aa9-4251-9772-53558aa9ba2b","Type":"ContainerStarted","Data":"88f47d261fb0acba9fb204b96264c1dc8695a1463761bbed915373dd19832168"} Nov 26 14:59:54 crc kubenswrapper[5200]: I1126 14:59:54.869332 5200 generic.go:358] "Generic (PLEG): container finished" podID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerID="3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726" exitCode=0 Nov 26 14:59:54 crc kubenswrapper[5200]: I1126 14:59:54.869398 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mp8k" event={"ID":"ce738762-6aa9-4251-9772-53558aa9ba2b","Type":"ContainerDied","Data":"3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726"} Nov 26 14:59:55 crc kubenswrapper[5200]: I1126 14:59:55.879950 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mp8k" event={"ID":"ce738762-6aa9-4251-9772-53558aa9ba2b","Type":"ContainerStarted","Data":"62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed"} Nov 26 14:59:55 crc kubenswrapper[5200]: I1126 14:59:55.905571 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9mp8k" podStartSLOduration=3.7526633350000003 podStartE2EDuration="4.905555268s" podCreationTimestamp="2025-11-26 14:59:51 +0000 UTC" firstStartedPulling="2025-11-26 14:59:52.845567852 +0000 UTC m=+5361.524769670" lastFinishedPulling="2025-11-26 14:59:53.998459775 +0000 UTC m=+5362.677661603" observedRunningTime="2025-11-26 14:59:55.899247341 +0000 UTC m=+5364.578449179" watchObservedRunningTime="2025-11-26 14:59:55.905555268 +0000 UTC m=+5364.584757086" Nov 26 14:59:58 crc kubenswrapper[5200]: I1126 14:59:58.969390 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 14:59:58 crc kubenswrapper[5200]: E1126 14:59:58.970461 5200 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.139019 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp"] Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.148549 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.150771 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.151214 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.200186 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp"] Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.218319 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-config-volume\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.218367 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-secret-volume\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.218409 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dx6j\" (UniqueName: \"kubernetes.io/projected/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-kube-api-access-8dx6j\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.321002 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-config-volume\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.321078 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-secret-volume\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.321139 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dx6j\" (UniqueName: \"kubernetes.io/projected/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-kube-api-access-8dx6j\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.322730 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-config-volume\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.340466 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dx6j\" (UniqueName: \"kubernetes.io/projected/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-kube-api-access-8dx6j\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.341155 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-secret-volume\") pod \"collect-profiles-29402820-fjwsp\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.496120 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.914235 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp"] Nov 26 15:00:00 crc kubenswrapper[5200]: I1126 15:00:00.938970 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" event={"ID":"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d","Type":"ContainerStarted","Data":"3e4499b92994b33deebacafb7f3ebe1ed815c859f0e819d8e4c99bf56bd230f2"} Nov 26 15:00:01 crc kubenswrapper[5200]: I1126 15:00:01.948236 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" containerID="13b1bf8445b805d72c8c617956efabb4d82b7967303a50729266a9119403b78d" exitCode=0 Nov 26 15:00:01 crc kubenswrapper[5200]: I1126 15:00:01.948338 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" event={"ID":"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d","Type":"ContainerDied","Data":"13b1bf8445b805d72c8c617956efabb4d82b7967303a50729266a9119403b78d"} Nov 26 15:00:02 crc kubenswrapper[5200]: I1126 15:00:02.158325 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 15:00:02 crc kubenswrapper[5200]: I1126 15:00:02.158376 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 15:00:02 crc kubenswrapper[5200]: I1126 15:00:02.209031 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.020071 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.062409 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mp8k"] Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.325793 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.379237 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-config-volume\") pod \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.379530 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dx6j\" (UniqueName: \"kubernetes.io/projected/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-kube-api-access-8dx6j\") pod \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.379593 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-secret-volume\") pod \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\" (UID: \"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d\") " Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.380060 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" (UID: "a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.390034 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" (UID: "a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.390616 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-kube-api-access-8dx6j" (OuterVolumeSpecName: "kube-api-access-8dx6j") pod "a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" (UID: "a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d"). InnerVolumeSpecName "kube-api-access-8dx6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.482020 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dx6j\" (UniqueName: \"kubernetes.io/projected/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-kube-api-access-8dx6j\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.482063 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.482095 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.979770 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.979766 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp" event={"ID":"a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d","Type":"ContainerDied","Data":"3e4499b92994b33deebacafb7f3ebe1ed815c859f0e819d8e4c99bf56bd230f2"} Nov 26 15:00:03 crc kubenswrapper[5200]: I1126 15:00:03.979964 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4499b92994b33deebacafb7f3ebe1ed815c859f0e819d8e4c99bf56bd230f2" Nov 26 15:00:04 crc kubenswrapper[5200]: I1126 15:00:04.396175 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r"] Nov 26 15:00:04 crc kubenswrapper[5200]: I1126 15:00:04.416993 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402775-j5z6r"] Nov 26 15:00:04 crc kubenswrapper[5200]: I1126 15:00:04.984876 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b745b09b-a67d-4aba-b374-fd012ef56f2c" path="/var/lib/kubelet/pods/b745b09b-a67d-4aba-b374-fd012ef56f2c/volumes" Nov 26 15:00:04 crc kubenswrapper[5200]: I1126 15:00:04.987968 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9mp8k" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="registry-server" containerID="cri-o://62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed" gracePeriod=2 Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.423447 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.521124 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-catalog-content\") pod \"ce738762-6aa9-4251-9772-53558aa9ba2b\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.521190 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpzkh\" (UniqueName: \"kubernetes.io/projected/ce738762-6aa9-4251-9772-53558aa9ba2b-kube-api-access-vpzkh\") pod \"ce738762-6aa9-4251-9772-53558aa9ba2b\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.521354 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-utilities\") pod \"ce738762-6aa9-4251-9772-53558aa9ba2b\" (UID: \"ce738762-6aa9-4251-9772-53558aa9ba2b\") " Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.522374 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-utilities" (OuterVolumeSpecName: "utilities") pod "ce738762-6aa9-4251-9772-53558aa9ba2b" (UID: "ce738762-6aa9-4251-9772-53558aa9ba2b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.527131 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce738762-6aa9-4251-9772-53558aa9ba2b-kube-api-access-vpzkh" (OuterVolumeSpecName: "kube-api-access-vpzkh") pod "ce738762-6aa9-4251-9772-53558aa9ba2b" (UID: "ce738762-6aa9-4251-9772-53558aa9ba2b"). InnerVolumeSpecName "kube-api-access-vpzkh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.531677 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce738762-6aa9-4251-9772-53558aa9ba2b" (UID: "ce738762-6aa9-4251-9772-53558aa9ba2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.623279 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpzkh\" (UniqueName: \"kubernetes.io/projected/ce738762-6aa9-4251-9772-53558aa9ba2b-kube-api-access-vpzkh\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.623313 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:05 crc kubenswrapper[5200]: I1126 15:00:05.623325 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce738762-6aa9-4251-9772-53558aa9ba2b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.001159 5200 generic.go:358] "Generic (PLEG): container finished" podID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerID="62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed" exitCode=0 Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.001301 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mp8k" event={"ID":"ce738762-6aa9-4251-9772-53558aa9ba2b","Type":"ContainerDied","Data":"62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed"} Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.001337 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mp8k" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.001384 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mp8k" event={"ID":"ce738762-6aa9-4251-9772-53558aa9ba2b","Type":"ContainerDied","Data":"88f47d261fb0acba9fb204b96264c1dc8695a1463761bbed915373dd19832168"} Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.001424 5200 scope.go:117] "RemoveContainer" containerID="62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.031755 5200 scope.go:117] "RemoveContainer" containerID="3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.061954 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mp8k"] Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.068247 5200 scope.go:117] "RemoveContainer" containerID="17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.068448 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mp8k"] Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.096273 5200 scope.go:117] "RemoveContainer" containerID="62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed" Nov 26 15:00:06 crc kubenswrapper[5200]: E1126 15:00:06.096764 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed\": container with ID starting with 62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed not found: ID does not exist" containerID="62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.096799 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed"} err="failed to get container status \"62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed\": rpc error: code = NotFound desc = could not find container \"62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed\": container with ID starting with 62b607b2363012a203f09d9939b61c4577e9018bbc76f8134eb0d2139310d9ed not found: ID does not exist" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.096821 5200 scope.go:117] "RemoveContainer" containerID="3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726" Nov 26 15:00:06 crc kubenswrapper[5200]: E1126 15:00:06.097182 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726\": container with ID starting with 3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726 not found: ID does not exist" containerID="3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.097244 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726"} err="failed to get container status \"3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726\": rpc error: code = NotFound desc = could not find 
container \"3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726\": container with ID starting with 3c2639edea36fd52515f047f3649c59c1bce26de498e5e06a9653de3a15ec726 not found: ID does not exist" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.097283 5200 scope.go:117] "RemoveContainer" containerID="17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a" Nov 26 15:00:06 crc kubenswrapper[5200]: E1126 15:00:06.098323 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a\": container with ID starting with 17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a not found: ID does not exist" containerID="17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.098358 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a"} err="failed to get container status \"17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a\": rpc error: code = NotFound desc = could not find container \"17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a\": container with ID starting with 17f7d7439742290caa1b79520644c0d1f05f4e909f20d3b0454484f0522e393a not found: ID does not exist" Nov 26 15:00:06 crc kubenswrapper[5200]: I1126 15:00:06.979800 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" path="/var/lib/kubelet/pods/ce738762-6aa9-4251-9772-53558aa9ba2b/volumes" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.530809 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-l4zh7"] Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532238 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="extract-content" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532264 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="extract-content" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532292 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" containerName="collect-profiles" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532301 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" containerName="collect-profiles" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532362 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="extract-utilities" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532374 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="extract-utilities" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532389 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="registry-server" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532399 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="registry-server" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 
15:00:07.532595 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" containerName="collect-profiles" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.532609 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce738762-6aa9-4251-9772-53558aa9ba2b" containerName="registry-server" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.538108 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.539483 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-af71-account-create-update-7rx84"] Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.550082 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.551978 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-db-secret\"" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.552009 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l4zh7"] Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.559279 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-af71-account-create-update-7rx84"] Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.657996 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-operator-scripts\") pod \"barbican-af71-account-create-update-7rx84\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.658055 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16627b-f04e-40f8-97fd-66dfef695d5c-operator-scripts\") pod \"barbican-db-create-l4zh7\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.658084 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzcxt\" (UniqueName: \"kubernetes.io/projected/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-kube-api-access-dzcxt\") pod \"barbican-af71-account-create-update-7rx84\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.658232 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdq5\" (UniqueName: \"kubernetes.io/projected/0d16627b-f04e-40f8-97fd-66dfef695d5c-kube-api-access-kvdq5\") pod \"barbican-db-create-l4zh7\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.759884 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16627b-f04e-40f8-97fd-66dfef695d5c-operator-scripts\") pod \"barbican-db-create-l4zh7\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc 
kubenswrapper[5200]: I1126 15:00:07.759952 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzcxt\" (UniqueName: \"kubernetes.io/projected/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-kube-api-access-dzcxt\") pod \"barbican-af71-account-create-update-7rx84\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.759981 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdq5\" (UniqueName: \"kubernetes.io/projected/0d16627b-f04e-40f8-97fd-66dfef695d5c-kube-api-access-kvdq5\") pod \"barbican-db-create-l4zh7\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.760477 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-operator-scripts\") pod \"barbican-af71-account-create-update-7rx84\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.761001 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16627b-f04e-40f8-97fd-66dfef695d5c-operator-scripts\") pod \"barbican-db-create-l4zh7\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.761063 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-operator-scripts\") pod \"barbican-af71-account-create-update-7rx84\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.778618 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdq5\" (UniqueName: \"kubernetes.io/projected/0d16627b-f04e-40f8-97fd-66dfef695d5c-kube-api-access-kvdq5\") pod \"barbican-db-create-l4zh7\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.778619 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzcxt\" (UniqueName: \"kubernetes.io/projected/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-kube-api-access-dzcxt\") pod \"barbican-af71-account-create-update-7rx84\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.858555 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:07 crc kubenswrapper[5200]: I1126 15:00:07.874046 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:08 crc kubenswrapper[5200]: I1126 15:00:08.317781 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l4zh7"] Nov 26 15:00:08 crc kubenswrapper[5200]: I1126 15:00:08.385044 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-af71-account-create-update-7rx84"] Nov 26 15:00:08 crc kubenswrapper[5200]: W1126 15:00:08.387947 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24e538b2_ca3f_48bf_85ad_abf37d2cf9ab.slice/crio-d64f6cd7f06a92734dad50b7a71d48ded7974d513d5f7a5fefa1861d0482e5f3 WatchSource:0}: Error finding container d64f6cd7f06a92734dad50b7a71d48ded7974d513d5f7a5fefa1861d0482e5f3: Status 404 returned error can't find the container with id d64f6cd7f06a92734dad50b7a71d48ded7974d513d5f7a5fefa1861d0482e5f3 Nov 26 15:00:09 crc kubenswrapper[5200]: I1126 15:00:09.041351 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l4zh7" event={"ID":"0d16627b-f04e-40f8-97fd-66dfef695d5c","Type":"ContainerDied","Data":"af1c3d825cf6c6bf07f06626bd0e2c0992c9015d47cb86a3a060d24b55fac3a6"} Nov 26 15:00:09 crc kubenswrapper[5200]: I1126 15:00:09.041294 5200 generic.go:358] "Generic (PLEG): container finished" podID="0d16627b-f04e-40f8-97fd-66dfef695d5c" containerID="af1c3d825cf6c6bf07f06626bd0e2c0992c9015d47cb86a3a060d24b55fac3a6" exitCode=0 Nov 26 15:00:09 crc kubenswrapper[5200]: I1126 15:00:09.041616 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l4zh7" event={"ID":"0d16627b-f04e-40f8-97fd-66dfef695d5c","Type":"ContainerStarted","Data":"f9f925bdebd2c98e40af968e172dd0496f4df6ac330887ba91a4dee4e8e243da"} Nov 26 15:00:09 crc kubenswrapper[5200]: I1126 15:00:09.043530 5200 generic.go:358] "Generic (PLEG): container finished" podID="24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" containerID="791821d66e1f8e8b70c4935a2399a576d89835d7b09ab5e09ba7317f1aec41d1" exitCode=0 Nov 26 15:00:09 crc kubenswrapper[5200]: I1126 15:00:09.043699 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af71-account-create-update-7rx84" event={"ID":"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab","Type":"ContainerDied","Data":"791821d66e1f8e8b70c4935a2399a576d89835d7b09ab5e09ba7317f1aec41d1"} Nov 26 15:00:09 crc kubenswrapper[5200]: I1126 15:00:09.043738 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af71-account-create-update-7rx84" event={"ID":"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab","Type":"ContainerStarted","Data":"d64f6cd7f06a92734dad50b7a71d48ded7974d513d5f7a5fefa1861d0482e5f3"} Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.471384 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.476303 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.614560 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvdq5\" (UniqueName: \"kubernetes.io/projected/0d16627b-f04e-40f8-97fd-66dfef695d5c-kube-api-access-kvdq5\") pod \"0d16627b-f04e-40f8-97fd-66dfef695d5c\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.614618 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-operator-scripts\") pod \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.614656 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16627b-f04e-40f8-97fd-66dfef695d5c-operator-scripts\") pod \"0d16627b-f04e-40f8-97fd-66dfef695d5c\" (UID: \"0d16627b-f04e-40f8-97fd-66dfef695d5c\") " Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.614723 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzcxt\" (UniqueName: \"kubernetes.io/projected/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-kube-api-access-dzcxt\") pod \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\" (UID: \"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab\") " Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.615342 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" (UID: "24e538b2-ca3f-48bf-85ad-abf37d2cf9ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.616229 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d16627b-f04e-40f8-97fd-66dfef695d5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d16627b-f04e-40f8-97fd-66dfef695d5c" (UID: "0d16627b-f04e-40f8-97fd-66dfef695d5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.621355 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d16627b-f04e-40f8-97fd-66dfef695d5c-kube-api-access-kvdq5" (OuterVolumeSpecName: "kube-api-access-kvdq5") pod "0d16627b-f04e-40f8-97fd-66dfef695d5c" (UID: "0d16627b-f04e-40f8-97fd-66dfef695d5c"). InnerVolumeSpecName "kube-api-access-kvdq5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.622926 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-kube-api-access-dzcxt" (OuterVolumeSpecName: "kube-api-access-dzcxt") pod "24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" (UID: "24e538b2-ca3f-48bf-85ad-abf37d2cf9ab"). InnerVolumeSpecName "kube-api-access-dzcxt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.717896 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvdq5\" (UniqueName: \"kubernetes.io/projected/0d16627b-f04e-40f8-97fd-66dfef695d5c-kube-api-access-kvdq5\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.717965 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.717991 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16627b-f04e-40f8-97fd-66dfef695d5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.718015 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzcxt\" (UniqueName: \"kubernetes.io/projected/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab-kube-api-access-dzcxt\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:10 crc kubenswrapper[5200]: I1126 15:00:10.969790 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 15:00:11 crc kubenswrapper[5200]: I1126 15:00:11.062490 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-af71-account-create-update-7rx84" Nov 26 15:00:11 crc kubenswrapper[5200]: I1126 15:00:11.062516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af71-account-create-update-7rx84" event={"ID":"24e538b2-ca3f-48bf-85ad-abf37d2cf9ab","Type":"ContainerDied","Data":"d64f6cd7f06a92734dad50b7a71d48ded7974d513d5f7a5fefa1861d0482e5f3"} Nov 26 15:00:11 crc kubenswrapper[5200]: I1126 15:00:11.063847 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64f6cd7f06a92734dad50b7a71d48ded7974d513d5f7a5fefa1861d0482e5f3" Nov 26 15:00:11 crc kubenswrapper[5200]: I1126 15:00:11.065653 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-l4zh7" Nov 26 15:00:11 crc kubenswrapper[5200]: I1126 15:00:11.065665 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l4zh7" event={"ID":"0d16627b-f04e-40f8-97fd-66dfef695d5c","Type":"ContainerDied","Data":"f9f925bdebd2c98e40af968e172dd0496f4df6ac330887ba91a4dee4e8e243da"} Nov 26 15:00:11 crc kubenswrapper[5200]: I1126 15:00:11.065780 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9f925bdebd2c98e40af968e172dd0496f4df6ac330887ba91a4dee4e8e243da" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.075579 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"c57089b0cc90614ea2b7d202afb3706d3131737053c127ff99d3fcebdf0010ac"} Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.800220 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9q58q"] Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.801591 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" containerName="mariadb-account-create-update" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.801616 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" containerName="mariadb-account-create-update" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.801646 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d16627b-f04e-40f8-97fd-66dfef695d5c" containerName="mariadb-database-create" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.801652 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d16627b-f04e-40f8-97fd-66dfef695d5c" containerName="mariadb-database-create" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.802022 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" containerName="mariadb-account-create-update" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.802095 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d16627b-f04e-40f8-97fd-66dfef695d5c" containerName="mariadb-database-create" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.821155 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9q58q"] Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.821302 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.824522 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-barbican-dockercfg-jtn64\"" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.825016 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-config-data\"" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.961579 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-combined-ca-bundle\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.961819 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-db-sync-config-data\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:12 crc kubenswrapper[5200]: I1126 15:00:12.962009 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsx7w\" (UniqueName: \"kubernetes.io/projected/3159e145-f5bf-41a3-aa10-87c4725e3781-kube-api-access-zsx7w\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.064123 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-combined-ca-bundle\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.064629 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-db-sync-config-data\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.064883 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsx7w\" (UniqueName: \"kubernetes.io/projected/3159e145-f5bf-41a3-aa10-87c4725e3781-kube-api-access-zsx7w\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.070339 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-db-sync-config-data\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.071216 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-combined-ca-bundle\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " 
pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.083169 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsx7w\" (UniqueName: \"kubernetes.io/projected/3159e145-f5bf-41a3-aa10-87c4725e3781-kube-api-access-zsx7w\") pod \"barbican-db-sync-9q58q\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.146252 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:13 crc kubenswrapper[5200]: I1126 15:00:13.756464 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9q58q"] Nov 26 15:00:13 crc kubenswrapper[5200]: W1126 15:00:13.761582 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3159e145_f5bf_41a3_aa10_87c4725e3781.slice/crio-4ef74892a1682fb003827d5415b2226bbd87810bb68b3f2214085f0ed5bd6b3a WatchSource:0}: Error finding container 4ef74892a1682fb003827d5415b2226bbd87810bb68b3f2214085f0ed5bd6b3a: Status 404 returned error can't find the container with id 4ef74892a1682fb003827d5415b2226bbd87810bb68b3f2214085f0ed5bd6b3a Nov 26 15:00:14 crc kubenswrapper[5200]: I1126 15:00:14.103749 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9q58q" event={"ID":"3159e145-f5bf-41a3-aa10-87c4725e3781","Type":"ContainerStarted","Data":"c5ae197d5ceda2edc06e34cd13d2bb3b53ee908f900c3e2ad32e73f850242229"} Nov 26 15:00:14 crc kubenswrapper[5200]: I1126 15:00:14.104140 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9q58q" event={"ID":"3159e145-f5bf-41a3-aa10-87c4725e3781","Type":"ContainerStarted","Data":"4ef74892a1682fb003827d5415b2226bbd87810bb68b3f2214085f0ed5bd6b3a"} Nov 26 15:00:14 crc kubenswrapper[5200]: I1126 15:00:14.128516 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9q58q" podStartSLOduration=2.128497164 podStartE2EDuration="2.128497164s" podCreationTimestamp="2025-11-26 15:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:14.123639616 +0000 UTC m=+5382.802841434" watchObservedRunningTime="2025-11-26 15:00:14.128497164 +0000 UTC m=+5382.807698982" Nov 26 15:00:15 crc kubenswrapper[5200]: I1126 15:00:15.114199 5200 generic.go:358] "Generic (PLEG): container finished" podID="3159e145-f5bf-41a3-aa10-87c4725e3781" containerID="c5ae197d5ceda2edc06e34cd13d2bb3b53ee908f900c3e2ad32e73f850242229" exitCode=0 Nov 26 15:00:15 crc kubenswrapper[5200]: I1126 15:00:15.114255 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9q58q" event={"ID":"3159e145-f5bf-41a3-aa10-87c4725e3781","Type":"ContainerDied","Data":"c5ae197d5ceda2edc06e34cd13d2bb3b53ee908f900c3e2ad32e73f850242229"} Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.433584 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.527781 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-combined-ca-bundle\") pod \"3159e145-f5bf-41a3-aa10-87c4725e3781\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.527866 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-db-sync-config-data\") pod \"3159e145-f5bf-41a3-aa10-87c4725e3781\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.527918 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsx7w\" (UniqueName: \"kubernetes.io/projected/3159e145-f5bf-41a3-aa10-87c4725e3781-kube-api-access-zsx7w\") pod \"3159e145-f5bf-41a3-aa10-87c4725e3781\" (UID: \"3159e145-f5bf-41a3-aa10-87c4725e3781\") " Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.536195 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3159e145-f5bf-41a3-aa10-87c4725e3781" (UID: "3159e145-f5bf-41a3-aa10-87c4725e3781"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.536530 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3159e145-f5bf-41a3-aa10-87c4725e3781-kube-api-access-zsx7w" (OuterVolumeSpecName: "kube-api-access-zsx7w") pod "3159e145-f5bf-41a3-aa10-87c4725e3781" (UID: "3159e145-f5bf-41a3-aa10-87c4725e3781"). InnerVolumeSpecName "kube-api-access-zsx7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.555900 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3159e145-f5bf-41a3-aa10-87c4725e3781" (UID: "3159e145-f5bf-41a3-aa10-87c4725e3781"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.630308 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.630345 5200 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3159e145-f5bf-41a3-aa10-87c4725e3781-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:16 crc kubenswrapper[5200]: I1126 15:00:16.630353 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsx7w\" (UniqueName: \"kubernetes.io/projected/3159e145-f5bf-41a3-aa10-87c4725e3781-kube-api-access-zsx7w\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.131018 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9q58q" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.131022 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9q58q" event={"ID":"3159e145-f5bf-41a3-aa10-87c4725e3781","Type":"ContainerDied","Data":"4ef74892a1682fb003827d5415b2226bbd87810bb68b3f2214085f0ed5bd6b3a"} Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.131501 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ef74892a1682fb003827d5415b2226bbd87810bb68b3f2214085f0ed5bd6b3a" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.355048 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d76b587dc-65xw5"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.356173 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3159e145-f5bf-41a3-aa10-87c4725e3781" containerName="barbican-db-sync" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.356191 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3159e145-f5bf-41a3-aa10-87c4725e3781" containerName="barbican-db-sync" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.356347 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3159e145-f5bf-41a3-aa10-87c4725e3781" containerName="barbican-db-sync" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.363968 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.367640 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-worker-config-data\"" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.367859 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7cbf768654-gszst"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.368899 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-config-data\"" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.377825 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.377643 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-barbican-dockercfg-jtn64\"" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.381316 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-keystone-listener-config-data\"" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.389293 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d76b587dc-65xw5"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.422903 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cbf768654-gszst"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464717 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-config-data\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464779 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-logs\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464805 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-combined-ca-bundle\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464851 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-config-data-custom\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464879 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576vl\" (UniqueName: \"kubernetes.io/projected/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-kube-api-access-576vl\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464904 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33fba1e6-7543-4be3-a405-bd2a4fe192a7-logs\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.464955 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-combined-ca-bundle\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.465033 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jznv5\" (UniqueName: \"kubernetes.io/projected/33fba1e6-7543-4be3-a405-bd2a4fe192a7-kube-api-access-jznv5\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.465085 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-config-data\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.465125 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-config-data-custom\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.506748 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7859465cfd-8zj58"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.512443 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ccd7b865-7m2zq"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.514604 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.518069 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-api-config-data\"" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.523047 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.525209 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7859465cfd-8zj58"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.536219 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ccd7b865-7m2zq"] Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.566535 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-nb\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567446 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmps\" (UniqueName: \"kubernetes.io/projected/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-kube-api-access-vzmps\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567531 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-sb\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567619 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jznv5\" (UniqueName: \"kubernetes.io/projected/33fba1e6-7543-4be3-a405-bd2a4fe192a7-kube-api-access-jznv5\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567739 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-config-data\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567803 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-logs\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567836 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-config-data-custom\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567909 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-config\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: 
\"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.567960 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-combined-ca-bundle\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568000 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-config-data\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568050 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-logs\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568075 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-combined-ca-bundle\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568108 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-dns-svc\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568152 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-config-data-custom\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568186 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-576vl\" (UniqueName: \"kubernetes.io/projected/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-kube-api-access-576vl\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568229 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33fba1e6-7543-4be3-a405-bd2a4fe192a7-logs\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568271 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-config-data\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568298 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-config-data-custom\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568340 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcz2q\" (UniqueName: \"kubernetes.io/projected/f28ec540-6c3a-456f-89ed-98f5b56910c9-kube-api-access-lcz2q\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.568383 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-combined-ca-bundle\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.574056 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-logs\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.575163 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-config-data-custom\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.576279 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33fba1e6-7543-4be3-a405-bd2a4fe192a7-logs\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.580040 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-combined-ca-bundle\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.580287 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-config-data-custom\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.591086 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-576vl\" (UniqueName: \"kubernetes.io/projected/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-kube-api-access-576vl\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.591413 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-combined-ca-bundle\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.595128 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33fba1e6-7543-4be3-a405-bd2a4fe192a7-config-data\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.595734 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fd7ae-4ef2-4894-a7c0-ea644c68c908-config-data\") pod \"barbican-worker-7d76b587dc-65xw5\" (UID: \"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908\") " pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.595856 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jznv5\" (UniqueName: \"kubernetes.io/projected/33fba1e6-7543-4be3-a405-bd2a4fe192a7-kube-api-access-jznv5\") pod \"barbican-keystone-listener-7cbf768654-gszst\" (UID: \"33fba1e6-7543-4be3-a405-bd2a4fe192a7\") " pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.669742 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-logs\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670218 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-logs\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670227 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-config\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670336 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-combined-ca-bundle\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670417 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-dns-svc\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670512 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-config-data\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670538 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-config-data-custom\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670574 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcz2q\" (UniqueName: \"kubernetes.io/projected/f28ec540-6c3a-456f-89ed-98f5b56910c9-kube-api-access-lcz2q\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670647 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-nb\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670700 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmps\" (UniqueName: \"kubernetes.io/projected/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-kube-api-access-vzmps\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.670732 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-sb\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.671562 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-config\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.671576 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-sb\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.672826 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-dns-svc\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.673731 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-nb\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.675048 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-combined-ca-bundle\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.675462 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-config-data-custom\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.677909 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-config-data\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.690595 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmps\" (UniqueName: \"kubernetes.io/projected/fd4019f7-8fb5-4355-84d8-f4f76473ccb0-kube-api-access-vzmps\") pod \"barbican-api-7859465cfd-8zj58\" (UID: \"fd4019f7-8fb5-4355-84d8-f4f76473ccb0\") " pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.690758 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcz2q\" (UniqueName: \"kubernetes.io/projected/f28ec540-6c3a-456f-89ed-98f5b56910c9-kube-api-access-lcz2q\") pod \"dnsmasq-dns-54ccd7b865-7m2zq\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.696980 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d76b587dc-65xw5" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.722077 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7cbf768654-gszst" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.833135 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:17 crc kubenswrapper[5200]: I1126 15:00:17.851258 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:18 crc kubenswrapper[5200]: I1126 15:00:18.193769 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cbf768654-gszst"] Nov 26 15:00:18 crc kubenswrapper[5200]: I1126 15:00:18.270758 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d76b587dc-65xw5"] Nov 26 15:00:18 crc kubenswrapper[5200]: W1126 15:00:18.282782 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac1fd7ae_4ef2_4894_a7c0_ea644c68c908.slice/crio-e05f5619cc4a82242fbfee63372ebdc5975dadcb0af9d8c9eea1e11e5a6e029e WatchSource:0}: Error finding container e05f5619cc4a82242fbfee63372ebdc5975dadcb0af9d8c9eea1e11e5a6e029e: Status 404 returned error can't find the container with id e05f5619cc4a82242fbfee63372ebdc5975dadcb0af9d8c9eea1e11e5a6e029e Nov 26 15:00:18 crc kubenswrapper[5200]: I1126 15:00:18.366925 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7859465cfd-8zj58"] Nov 26 15:00:18 crc kubenswrapper[5200]: W1126 15:00:18.374794 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd4019f7_8fb5_4355_84d8_f4f76473ccb0.slice/crio-222ba1fe59a6cca7853ad4d0259487bbb736b981fa0615492ba903c09c0fff10 WatchSource:0}: Error finding container 222ba1fe59a6cca7853ad4d0259487bbb736b981fa0615492ba903c09c0fff10: Status 404 returned error can't find the container with id 222ba1fe59a6cca7853ad4d0259487bbb736b981fa0615492ba903c09c0fff10 Nov 26 15:00:18 crc kubenswrapper[5200]: I1126 15:00:18.443781 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ccd7b865-7m2zq"] Nov 26 15:00:18 crc kubenswrapper[5200]: W1126 15:00:18.464923 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf28ec540_6c3a_456f_89ed_98f5b56910c9.slice/crio-eba8ab6edf68347d082f36d61970e1b8c4af5a193a5febb61450662d368e1807 WatchSource:0}: Error finding container eba8ab6edf68347d082f36d61970e1b8c4af5a193a5febb61450662d368e1807: Status 404 returned error can't find the container with id eba8ab6edf68347d082f36d61970e1b8c4af5a193a5febb61450662d368e1807 Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.151521 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cbf768654-gszst" event={"ID":"33fba1e6-7543-4be3-a405-bd2a4fe192a7","Type":"ContainerStarted","Data":"2c7b30d389f53a5a9cefd561c4c1372cbb60ae25fa1058631ca3698f861d7d3e"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.151944 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cbf768654-gszst" event={"ID":"33fba1e6-7543-4be3-a405-bd2a4fe192a7","Type":"ContainerStarted","Data":"e8b71c095e4e8a1ddf5c52e59c32fdfe8958e5c87e88dc39457565aebf42b319"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.151963 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cbf768654-gszst" event={"ID":"33fba1e6-7543-4be3-a405-bd2a4fe192a7","Type":"ContainerStarted","Data":"64b5e7649d947ede22cb628f3395cd17d3e384fcc2127b6d4b1bc0329b4e9626"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.154404 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7859465cfd-8zj58" 
event={"ID":"fd4019f7-8fb5-4355-84d8-f4f76473ccb0","Type":"ContainerStarted","Data":"7355cde65af872dc649f4178a4c9223ce1f45d258894107b658b5c16fede5487"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.154456 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7859465cfd-8zj58" event={"ID":"fd4019f7-8fb5-4355-84d8-f4f76473ccb0","Type":"ContainerStarted","Data":"9bca3571b39925e6c7171273204a128e251151019eb76e1ce5ba9b98c43c0304"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.154468 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7859465cfd-8zj58" event={"ID":"fd4019f7-8fb5-4355-84d8-f4f76473ccb0","Type":"ContainerStarted","Data":"222ba1fe59a6cca7853ad4d0259487bbb736b981fa0615492ba903c09c0fff10"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.154510 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.156788 5200 generic.go:358] "Generic (PLEG): container finished" podID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerID="04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34" exitCode=0 Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.156826 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" event={"ID":"f28ec540-6c3a-456f-89ed-98f5b56910c9","Type":"ContainerDied","Data":"04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.156872 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" event={"ID":"f28ec540-6c3a-456f-89ed-98f5b56910c9","Type":"ContainerStarted","Data":"eba8ab6edf68347d082f36d61970e1b8c4af5a193a5febb61450662d368e1807"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.159494 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d76b587dc-65xw5" event={"ID":"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908","Type":"ContainerStarted","Data":"647b028bbf47207969d01d8a7e42bdcfb25e4342cf0b97ebfc8141b994ce9d0d"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.159592 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d76b587dc-65xw5" event={"ID":"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908","Type":"ContainerStarted","Data":"f31189987b69148e667bb5a3dda4f35eb6606b3d024f056c3160be92e07757cf"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.159626 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d76b587dc-65xw5" event={"ID":"ac1fd7ae-4ef2-4894-a7c0-ea644c68c908","Type":"ContainerStarted","Data":"e05f5619cc4a82242fbfee63372ebdc5975dadcb0af9d8c9eea1e11e5a6e029e"} Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.202567 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7cbf768654-gszst" podStartSLOduration=2.202549516 podStartE2EDuration="2.202549516s" podCreationTimestamp="2025-11-26 15:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:19.193698292 +0000 UTC m=+5387.872900110" watchObservedRunningTime="2025-11-26 15:00:19.202549516 +0000 UTC m=+5387.881751324" Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.303954 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-7859465cfd-8zj58" podStartSLOduration=2.303919332 podStartE2EDuration="2.303919332s" podCreationTimestamp="2025-11-26 15:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:19.258367959 +0000 UTC m=+5387.937569787" watchObservedRunningTime="2025-11-26 15:00:19.303919332 +0000 UTC m=+5387.983121150" Nov 26 15:00:19 crc kubenswrapper[5200]: I1126 15:00:19.306050 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d76b587dc-65xw5" podStartSLOduration=2.306043648 podStartE2EDuration="2.306043648s" podCreationTimestamp="2025-11-26 15:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:19.286737628 +0000 UTC m=+5387.965939466" watchObservedRunningTime="2025-11-26 15:00:19.306043648 +0000 UTC m=+5387.985245466" Nov 26 15:00:20 crc kubenswrapper[5200]: I1126 15:00:20.172454 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" event={"ID":"f28ec540-6c3a-456f-89ed-98f5b56910c9","Type":"ContainerStarted","Data":"75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af"} Nov 26 15:00:20 crc kubenswrapper[5200]: I1126 15:00:20.174510 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:20 crc kubenswrapper[5200]: I1126 15:00:20.174561 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:20 crc kubenswrapper[5200]: I1126 15:00:20.195493 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" podStartSLOduration=3.195461975 podStartE2EDuration="3.195461975s" podCreationTimestamp="2025-11-26 15:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:20.191924782 +0000 UTC m=+5388.871126660" watchObservedRunningTime="2025-11-26 15:00:20.195461975 +0000 UTC m=+5388.874663863" Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.188893 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.283340 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d4bf4bdcc-4k76c"] Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.283642 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" podUID="fbd62451-f99b-4030-a542-d749bbff084d" containerName="dnsmasq-dns" containerID="cri-o://32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d" gracePeriod=10 Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.709807 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.753804 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7859465cfd-8zj58" Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.836925 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.970980 5200 scope.go:117] "RemoveContainer" containerID="1bad2ab490a2ef5d461b78889fc1f6972d9d09f4243c41b24b2ade67bd2f181e" Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.992089 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-nb\") pod \"fbd62451-f99b-4030-a542-d749bbff084d\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.992155 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbf5n\" (UniqueName: \"kubernetes.io/projected/fbd62451-f99b-4030-a542-d749bbff084d-kube-api-access-hbf5n\") pod \"fbd62451-f99b-4030-a542-d749bbff084d\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.992426 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-sb\") pod \"fbd62451-f99b-4030-a542-d749bbff084d\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.992452 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-config\") pod \"fbd62451-f99b-4030-a542-d749bbff084d\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " Nov 26 15:00:27 crc kubenswrapper[5200]: I1126 15:00:27.992655 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-dns-svc\") pod \"fbd62451-f99b-4030-a542-d749bbff084d\" (UID: \"fbd62451-f99b-4030-a542-d749bbff084d\") " Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.003396 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd62451-f99b-4030-a542-d749bbff084d-kube-api-access-hbf5n" (OuterVolumeSpecName: "kube-api-access-hbf5n") pod "fbd62451-f99b-4030-a542-d749bbff084d" (UID: "fbd62451-f99b-4030-a542-d749bbff084d"). InnerVolumeSpecName "kube-api-access-hbf5n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.050963 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fbd62451-f99b-4030-a542-d749bbff084d" (UID: "fbd62451-f99b-4030-a542-d749bbff084d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.052535 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-config" (OuterVolumeSpecName: "config") pod "fbd62451-f99b-4030-a542-d749bbff084d" (UID: "fbd62451-f99b-4030-a542-d749bbff084d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.059042 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fbd62451-f99b-4030-a542-d749bbff084d" (UID: "fbd62451-f99b-4030-a542-d749bbff084d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.065186 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fbd62451-f99b-4030-a542-d749bbff084d" (UID: "fbd62451-f99b-4030-a542-d749bbff084d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.095287 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.095320 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hbf5n\" (UniqueName: \"kubernetes.io/projected/fbd62451-f99b-4030-a542-d749bbff084d-kube-api-access-hbf5n\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.095335 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.095345 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.095354 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbd62451-f99b-4030-a542-d749bbff084d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.302872 5200 generic.go:358] "Generic (PLEG): container finished" podID="fbd62451-f99b-4030-a542-d749bbff084d" containerID="32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d" exitCode=0 Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.304724 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" event={"ID":"fbd62451-f99b-4030-a542-d749bbff084d","Type":"ContainerDied","Data":"32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d"} Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.304812 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" event={"ID":"fbd62451-f99b-4030-a542-d749bbff084d","Type":"ContainerDied","Data":"a2cada01148cbc68555a5c8de04c1afb78903fc60e738d0f1059e52b394a2cc7"} Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.304848 5200 scope.go:117] "RemoveContainer" containerID="32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.304964 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d4bf4bdcc-4k76c" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.334739 5200 scope.go:117] "RemoveContainer" containerID="00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.358014 5200 scope.go:117] "RemoveContainer" containerID="32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d" Nov 26 15:00:28 crc kubenswrapper[5200]: E1126 15:00:28.359635 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d\": container with ID starting with 32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d not found: ID does not exist" containerID="32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.359700 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d"} err="failed to get container status \"32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d\": rpc error: code = NotFound desc = could not find container \"32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d\": container with ID starting with 32691e81b40cff0255bdf08e7a5b3fabbb35a5f399624cadc1577b6a00d9321d not found: ID does not exist" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.359729 5200 scope.go:117] "RemoveContainer" containerID="00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a" Nov 26 15:00:28 crc kubenswrapper[5200]: E1126 15:00:28.361509 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a\": container with ID starting with 00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a not found: ID does not exist" containerID="00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.361611 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a"} err="failed to get container status \"00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a\": rpc error: code = NotFound desc = could not find container \"00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a\": container with ID starting with 00c6f8e53443379cffd2c09f8bf06125321a323deddaa5cc28049bfa9f688a8a not found: ID does not exist" Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.365461 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d4bf4bdcc-4k76c"] Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.377293 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d4bf4bdcc-4k76c"] Nov 26 15:00:28 crc kubenswrapper[5200]: I1126 15:00:28.991935 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd62451-f99b-4030-a542-d749bbff084d" path="/var/lib/kubelet/pods/fbd62451-f99b-4030-a542-d749bbff084d/volumes" Nov 26 15:00:37 crc kubenswrapper[5200]: I1126 15:00:37.177874 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:00:37 crc kubenswrapper[5200]: 
I1126 15:00:37.177936 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:00:37 crc kubenswrapper[5200]: I1126 15:00:37.186530 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:00:37 crc kubenswrapper[5200]: I1126 15:00:37.187166 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.159026 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tnfxv"] Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.161203 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbd62451-f99b-4030-a542-d749bbff084d" containerName="dnsmasq-dns" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.161220 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd62451-f99b-4030-a542-d749bbff084d" containerName="dnsmasq-dns" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.161249 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbd62451-f99b-4030-a542-d749bbff084d" containerName="init" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.161254 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd62451-f99b-4030-a542-d749bbff084d" containerName="init" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.161448 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbd62451-f99b-4030-a542-d749bbff084d" containerName="dnsmasq-dns" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.166913 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.182396 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tnfxv"] Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.198160 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-operator-scripts\") pod \"neutron-db-create-tnfxv\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.198271 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nbc\" (UniqueName: \"kubernetes.io/projected/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-kube-api-access-k5nbc\") pod \"neutron-db-create-tnfxv\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.251663 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-2f2d-account-create-update-rdf8p"] Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.258344 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.261977 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-db-secret\"" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.267192 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2f2d-account-create-update-rdf8p"] Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.300665 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-operator-scripts\") pod \"neutron-db-create-tnfxv\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.300864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nbc\" (UniqueName: \"kubernetes.io/projected/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-kube-api-access-k5nbc\") pod \"neutron-db-create-tnfxv\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.301151 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtm8c\" (UniqueName: \"kubernetes.io/projected/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-kube-api-access-gtm8c\") pod \"neutron-2f2d-account-create-update-rdf8p\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.301210 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-operator-scripts\") pod \"neutron-2f2d-account-create-update-rdf8p\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.301782 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-operator-scripts\") pod \"neutron-db-create-tnfxv\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.331894 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nbc\" (UniqueName: \"kubernetes.io/projected/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-kube-api-access-k5nbc\") pod \"neutron-db-create-tnfxv\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.403465 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtm8c\" (UniqueName: \"kubernetes.io/projected/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-kube-api-access-gtm8c\") pod \"neutron-2f2d-account-create-update-rdf8p\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.403540 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-operator-scripts\") pod 
\"neutron-2f2d-account-create-update-rdf8p\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.404659 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-operator-scripts\") pod \"neutron-2f2d-account-create-update-rdf8p\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.434737 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtm8c\" (UniqueName: \"kubernetes.io/projected/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-kube-api-access-gtm8c\") pod \"neutron-2f2d-account-create-update-rdf8p\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.499546 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.617800 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:40 crc kubenswrapper[5200]: I1126 15:00:40.799022 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tnfxv"] Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.097362 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2f2d-account-create-update-rdf8p"] Nov 26 15:00:41 crc kubenswrapper[5200]: W1126 15:00:41.101926 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b41a86_1404_4554_9b24_7c8eaa6f0ac9.slice/crio-89920faa38efc0d50390dadd7a65093a3250dab00af0e569ace3cb7ce36f1de9 WatchSource:0}: Error finding container 89920faa38efc0d50390dadd7a65093a3250dab00af0e569ace3cb7ce36f1de9: Status 404 returned error can't find the container with id 89920faa38efc0d50390dadd7a65093a3250dab00af0e569ace3cb7ce36f1de9 Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.476567 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2f2d-account-create-update-rdf8p" event={"ID":"28b41a86-1404-4554-9b24-7c8eaa6f0ac9","Type":"ContainerStarted","Data":"d6f4bd23875b61dc826f2579db44df7a38ac0145070b9a05418474889fe09146"} Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.476623 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2f2d-account-create-update-rdf8p" event={"ID":"28b41a86-1404-4554-9b24-7c8eaa6f0ac9","Type":"ContainerStarted","Data":"89920faa38efc0d50390dadd7a65093a3250dab00af0e569ace3cb7ce36f1de9"} Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.481921 5200 generic.go:358] "Generic (PLEG): container finished" podID="b22c5306-1d48-4995-83ca-b6fc66bd3b1f" containerID="e43f3e0a42eb44cc325c4c9c25910365ef5b6b62cdb4b48779899fc76ad9aaa4" exitCode=0 Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.482044 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tnfxv" event={"ID":"b22c5306-1d48-4995-83ca-b6fc66bd3b1f","Type":"ContainerDied","Data":"e43f3e0a42eb44cc325c4c9c25910365ef5b6b62cdb4b48779899fc76ad9aaa4"} Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.482085 5200 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-db-create-tnfxv" event={"ID":"b22c5306-1d48-4995-83ca-b6fc66bd3b1f","Type":"ContainerStarted","Data":"b23c2331e73a26c7beb82ccac3c0b7e06db0a3dce452111aa8daa85959d6a0bd"} Nov 26 15:00:41 crc kubenswrapper[5200]: I1126 15:00:41.514457 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-2f2d-account-create-update-rdf8p" podStartSLOduration=1.514431619 podStartE2EDuration="1.514431619s" podCreationTimestamp="2025-11-26 15:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:41.49967995 +0000 UTC m=+5410.178881778" watchObservedRunningTime="2025-11-26 15:00:41.514431619 +0000 UTC m=+5410.193633437" Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.500668 5200 generic.go:358] "Generic (PLEG): container finished" podID="28b41a86-1404-4554-9b24-7c8eaa6f0ac9" containerID="d6f4bd23875b61dc826f2579db44df7a38ac0145070b9a05418474889fe09146" exitCode=0 Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.500973 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2f2d-account-create-update-rdf8p" event={"ID":"28b41a86-1404-4554-9b24-7c8eaa6f0ac9","Type":"ContainerDied","Data":"d6f4bd23875b61dc826f2579db44df7a38ac0145070b9a05418474889fe09146"} Nov 26 15:00:42 crc kubenswrapper[5200]: E1126 15:00:42.617571 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1584062 actualBytes=10240 Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.857076 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.962649 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-operator-scripts\") pod \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.963137 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nbc\" (UniqueName: \"kubernetes.io/projected/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-kube-api-access-k5nbc\") pod \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\" (UID: \"b22c5306-1d48-4995-83ca-b6fc66bd3b1f\") " Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.963351 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b22c5306-1d48-4995-83ca-b6fc66bd3b1f" (UID: "b22c5306-1d48-4995-83ca-b6fc66bd3b1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.963937 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:42 crc kubenswrapper[5200]: I1126 15:00:42.970988 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-kube-api-access-k5nbc" (OuterVolumeSpecName: "kube-api-access-k5nbc") pod "b22c5306-1d48-4995-83ca-b6fc66bd3b1f" (UID: "b22c5306-1d48-4995-83ca-b6fc66bd3b1f"). 
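The pod_startup_latency_tracker.go:104 records in this stretch ("Observed pod startup duration" ... podStartSLOduration=...) report, for each pod, the seconds from creation to the first observed running state. A minimal sketch, reusing the same hypothetical kubelet.log file as above, that pulls those figures out and lists the slowest starters first:

    import re

    # e.g. "Observed pod startup duration" pod="openstack/neutron-db-sync-wm5bt" podStartSLOduration=2.621827578 ...
    STARTUP = re.compile(
        r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" '
        r'podStartSLOduration=(?P<slo>[0-9.]+)'
    )

    def startup_durations(path="kubelet.log"):      # hypothetical file name
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                for m in STARTUP.finditer(line):
                    yield m["pod"], float(m["slo"])

    if __name__ == "__main__":
        for pod, seconds in sorted(startup_durations(), key=lambda item: -item[1]):
            print(f"{seconds:8.3f}s  {pod}")

In this excerpt the OpenStack job pods all start within one to three seconds, and firstStartedPulling/lastFinishedPulling stay at the Go zero time, which typically means no image pull was needed.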
InnerVolumeSpecName "kube-api-access-k5nbc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:43 crc kubenswrapper[5200]: I1126 15:00:43.066915 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5nbc\" (UniqueName: \"kubernetes.io/projected/b22c5306-1d48-4995-83ca-b6fc66bd3b1f-kube-api-access-k5nbc\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:43 crc kubenswrapper[5200]: I1126 15:00:43.519082 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tnfxv" event={"ID":"b22c5306-1d48-4995-83ca-b6fc66bd3b1f","Type":"ContainerDied","Data":"b23c2331e73a26c7beb82ccac3c0b7e06db0a3dce452111aa8daa85959d6a0bd"} Nov 26 15:00:43 crc kubenswrapper[5200]: I1126 15:00:43.519138 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tnfxv" Nov 26 15:00:43 crc kubenswrapper[5200]: I1126 15:00:43.519164 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b23c2331e73a26c7beb82ccac3c0b7e06db0a3dce452111aa8daa85959d6a0bd" Nov 26 15:00:43 crc kubenswrapper[5200]: I1126 15:00:43.968062 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.090606 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtm8c\" (UniqueName: \"kubernetes.io/projected/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-kube-api-access-gtm8c\") pod \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.091159 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-operator-scripts\") pod \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\" (UID: \"28b41a86-1404-4554-9b24-7c8eaa6f0ac9\") " Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.092671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28b41a86-1404-4554-9b24-7c8eaa6f0ac9" (UID: "28b41a86-1404-4554-9b24-7c8eaa6f0ac9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.099952 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-kube-api-access-gtm8c" (OuterVolumeSpecName: "kube-api-access-gtm8c") pod "28b41a86-1404-4554-9b24-7c8eaa6f0ac9" (UID: "28b41a86-1404-4554-9b24-7c8eaa6f0ac9"). InnerVolumeSpecName "kube-api-access-gtm8c". 
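The kubelet.go:2569 "SyncLoop (PLEG): event for pod" records just above carry a small payload (pod UID, event Type, container or sandbox ID) that, as printed here, parses as JSON. Pairing ContainerStarted with ContainerDied per pod is enough to reconstruct the short job lifecycles seen in this window, where the neutron-db-create and account-create containers each run once and exit 0. An illustrative, line-oriented sketch under the same kubelet.log assumption; PLEG records that happen to be wrapped across lines are skipped by this pass:

    import json
    import re
    from collections import defaultdict

    # e.g. ... event={"ID":"28b41a86-...","Type":"ContainerDied","Data":"d6f4bd23..."}
    PLEG = re.compile(
        r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" event=(?P<event>\{[^}]+\})'
    )

    def pleg_events(path="kubelet.log"):            # hypothetical file name
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                for m in PLEG.finditer(line):
                    yield m["pod"], json.loads(m["event"])   # the payload is plain JSON as logged

    if __name__ == "__main__":
        history = defaultdict(list)
        for pod, event in pleg_events():
            history[pod].append((event["Type"], event["Data"][:12]))  # short container/sandbox ID prefix
        for pod, events in sorted(history.items()):
            print(pod)
            for kind, short_id in events:
                print(f"    {kind:16s} {short_id}")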
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.193281 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.193321 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtm8c\" (UniqueName: \"kubernetes.io/projected/28b41a86-1404-4554-9b24-7c8eaa6f0ac9-kube-api-access-gtm8c\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.550100 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2f2d-account-create-update-rdf8p" event={"ID":"28b41a86-1404-4554-9b24-7c8eaa6f0ac9","Type":"ContainerDied","Data":"89920faa38efc0d50390dadd7a65093a3250dab00af0e569ace3cb7ce36f1de9"} Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.550190 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89920faa38efc0d50390dadd7a65093a3250dab00af0e569ace3cb7ce36f1de9" Nov 26 15:00:44 crc kubenswrapper[5200]: I1126 15:00:44.550333 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2f2d-account-create-update-rdf8p" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.474004 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wm5bt"] Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.475403 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b22c5306-1d48-4995-83ca-b6fc66bd3b1f" containerName="mariadb-database-create" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.475510 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b22c5306-1d48-4995-83ca-b6fc66bd3b1f" containerName="mariadb-database-create" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.475607 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28b41a86-1404-4554-9b24-7c8eaa6f0ac9" containerName="mariadb-account-create-update" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.475617 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b41a86-1404-4554-9b24-7c8eaa6f0ac9" containerName="mariadb-account-create-update" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.475793 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="28b41a86-1404-4554-9b24-7c8eaa6f0ac9" containerName="mariadb-account-create-update" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.475817 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b22c5306-1d48-4995-83ca-b6fc66bd3b1f" containerName="mariadb-database-create" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.482504 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.487360 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-httpd-config\"" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.487695 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-config\"" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.491220 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-neutron-dockercfg-brhds\"" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.528896 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wm5bt"] Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.632265 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-combined-ca-bundle\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.632359 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkhxw\" (UniqueName: \"kubernetes.io/projected/64ab3495-0e54-4181-a051-e292abef15a9-kube-api-access-fkhxw\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.632417 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-config\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.737913 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhxw\" (UniqueName: \"kubernetes.io/projected/64ab3495-0e54-4181-a051-e292abef15a9-kube-api-access-fkhxw\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.738357 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-config\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.740464 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-combined-ca-bundle\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.747709 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-combined-ca-bundle\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.747834 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-config\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.755652 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhxw\" (UniqueName: \"kubernetes.io/projected/64ab3495-0e54-4181-a051-e292abef15a9-kube-api-access-fkhxw\") pod \"neutron-db-sync-wm5bt\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:45 crc kubenswrapper[5200]: I1126 15:00:45.855469 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:46 crc kubenswrapper[5200]: I1126 15:00:46.347508 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wm5bt"] Nov 26 15:00:46 crc kubenswrapper[5200]: I1126 15:00:46.585359 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wm5bt" event={"ID":"64ab3495-0e54-4181-a051-e292abef15a9","Type":"ContainerStarted","Data":"9931630bf83f1697c7ebf92e5a78f71e880d9c313bc53c5dc8a5f23c2a3c159d"} Nov 26 15:00:47 crc kubenswrapper[5200]: I1126 15:00:47.599308 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wm5bt" event={"ID":"64ab3495-0e54-4181-a051-e292abef15a9","Type":"ContainerStarted","Data":"81c362e05144c036d596398dcd8fb84c2037aba9ee33addd0ef4714b0cbcf99d"} Nov 26 15:00:47 crc kubenswrapper[5200]: I1126 15:00:47.621848 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wm5bt" podStartSLOduration=2.621827578 podStartE2EDuration="2.621827578s" podCreationTimestamp="2025-11-26 15:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:47.613517169 +0000 UTC m=+5416.292719007" watchObservedRunningTime="2025-11-26 15:00:47.621827578 +0000 UTC m=+5416.301029396" Nov 26 15:00:50 crc kubenswrapper[5200]: I1126 15:00:50.639213 5200 generic.go:358] "Generic (PLEG): container finished" podID="64ab3495-0e54-4181-a051-e292abef15a9" containerID="81c362e05144c036d596398dcd8fb84c2037aba9ee33addd0ef4714b0cbcf99d" exitCode=0 Nov 26 15:00:50 crc kubenswrapper[5200]: I1126 15:00:50.639325 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wm5bt" event={"ID":"64ab3495-0e54-4181-a051-e292abef15a9","Type":"ContainerDied","Data":"81c362e05144c036d596398dcd8fb84c2037aba9ee33addd0ef4714b0cbcf99d"} Nov 26 15:00:51 crc kubenswrapper[5200]: I1126 15:00:51.976972 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.077835 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-config\") pod \"64ab3495-0e54-4181-a051-e292abef15a9\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.077991 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-combined-ca-bundle\") pod \"64ab3495-0e54-4181-a051-e292abef15a9\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.078069 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkhxw\" (UniqueName: \"kubernetes.io/projected/64ab3495-0e54-4181-a051-e292abef15a9-kube-api-access-fkhxw\") pod \"64ab3495-0e54-4181-a051-e292abef15a9\" (UID: \"64ab3495-0e54-4181-a051-e292abef15a9\") " Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.086785 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ab3495-0e54-4181-a051-e292abef15a9-kube-api-access-fkhxw" (OuterVolumeSpecName: "kube-api-access-fkhxw") pod "64ab3495-0e54-4181-a051-e292abef15a9" (UID: "64ab3495-0e54-4181-a051-e292abef15a9"). InnerVolumeSpecName "kube-api-access-fkhxw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.104775 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ab3495-0e54-4181-a051-e292abef15a9" (UID: "64ab3495-0e54-4181-a051-e292abef15a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.113854 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-config" (OuterVolumeSpecName: "config") pod "64ab3495-0e54-4181-a051-e292abef15a9" (UID: "64ab3495-0e54-4181-a051-e292abef15a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.180090 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.180125 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ab3495-0e54-4181-a051-e292abef15a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.180137 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkhxw\" (UniqueName: \"kubernetes.io/projected/64ab3495-0e54-4181-a051-e292abef15a9-kube-api-access-fkhxw\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.671440 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wm5bt" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.671468 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wm5bt" event={"ID":"64ab3495-0e54-4181-a051-e292abef15a9","Type":"ContainerDied","Data":"9931630bf83f1697c7ebf92e5a78f71e880d9c313bc53c5dc8a5f23c2a3c159d"} Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.672224 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9931630bf83f1697c7ebf92e5a78f71e880d9c313bc53c5dc8a5f23c2a3c159d" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.928574 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdcdf5569-szn2x"] Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.930057 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64ab3495-0e54-4181-a051-e292abef15a9" containerName="neutron-db-sync" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.930079 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ab3495-0e54-4181-a051-e292abef15a9" containerName="neutron-db-sync" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.930258 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="64ab3495-0e54-4181-a051-e292abef15a9" containerName="neutron-db-sync" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.938455 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.957363 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdcdf5569-szn2x"] Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.999730 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjpp\" (UniqueName: \"kubernetes.io/projected/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-kube-api-access-lxjpp\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:52 crc kubenswrapper[5200]: I1126 15:00:52.999785 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-config\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:52.999805 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:52.999917 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-dns-svc\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:52.999938 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.018021 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d9c6594ff-hl6n8"] Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.026179 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.030970 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-neutron-dockercfg-brhds\"" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.030988 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-httpd-config\"" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.031599 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-config\"" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.039741 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d9c6594ff-hl6n8"] Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.102247 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-config\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.102389 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbx66\" (UniqueName: \"kubernetes.io/projected/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-kube-api-access-sbx66\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.102431 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-dns-svc\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.102462 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.102490 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-combined-ca-bundle\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.102887 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-httpd-config\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: 
\"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.103283 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjpp\" (UniqueName: \"kubernetes.io/projected/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-kube-api-access-lxjpp\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.103384 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-config\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.103433 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.103481 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-dns-svc\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.103617 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.104125 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-config\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.104812 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.123283 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjpp\" (UniqueName: \"kubernetes.io/projected/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-kube-api-access-lxjpp\") pod \"dnsmasq-dns-5fdcdf5569-szn2x\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.204931 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-config\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 
15:00:53.205054 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbx66\" (UniqueName: \"kubernetes.io/projected/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-kube-api-access-sbx66\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.205090 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-combined-ca-bundle\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.205160 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-httpd-config\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.210783 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-config\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.218500 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-httpd-config\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.228771 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-combined-ca-bundle\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.233334 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbx66\" (UniqueName: \"kubernetes.io/projected/fc2a4c62-a08e-4442-a3c3-b26e0360b30a-kube-api-access-sbx66\") pod \"neutron-7d9c6594ff-hl6n8\" (UID: \"fc2a4c62-a08e-4442-a3c3-b26e0360b30a\") " pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.267025 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.350287 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:53 crc kubenswrapper[5200]: I1126 15:00:53.917388 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdcdf5569-szn2x"] Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.070876 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d9c6594ff-hl6n8"] Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.694965 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d9c6594ff-hl6n8" event={"ID":"fc2a4c62-a08e-4442-a3c3-b26e0360b30a","Type":"ContainerStarted","Data":"cdd723c72695d74ed4a27439203d18cc37dad6d81be6f9692909f3eb582d0957"} Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.695373 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d9c6594ff-hl6n8" event={"ID":"fc2a4c62-a08e-4442-a3c3-b26e0360b30a","Type":"ContainerStarted","Data":"606d13e47cb8a5db7e92e62e13c2c54ddc5fde270704ee7f4954b6286ddff7d2"} Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.695389 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d9c6594ff-hl6n8" event={"ID":"fc2a4c62-a08e-4442-a3c3-b26e0360b30a","Type":"ContainerStarted","Data":"e94f8720997c68f4ed45bc8aa5453be12c5e8e4c6df9d9ca63ccb9bc7e4150d2"} Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.695405 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.696733 5200 generic.go:358] "Generic (PLEG): container finished" podID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerID="09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55" exitCode=0 Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.696827 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" event={"ID":"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f","Type":"ContainerDied","Data":"09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55"} Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.696852 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" event={"ID":"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f","Type":"ContainerStarted","Data":"21ecc4c7cb58a99564532bae6e97da3fa50d1dd50025edfca2d538a2f55deb8f"} Nov 26 15:00:54 crc kubenswrapper[5200]: I1126 15:00:54.732198 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d9c6594ff-hl6n8" podStartSLOduration=2.732174792 podStartE2EDuration="2.732174792s" podCreationTimestamp="2025-11-26 15:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:54.717074564 +0000 UTC m=+5423.396276402" watchObservedRunningTime="2025-11-26 15:00:54.732174792 +0000 UTC m=+5423.411376610" Nov 26 15:00:55 crc kubenswrapper[5200]: I1126 15:00:55.711296 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" event={"ID":"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f","Type":"ContainerStarted","Data":"fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9"} Nov 26 15:00:55 crc kubenswrapper[5200]: I1126 15:00:55.712060 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:00:55 crc kubenswrapper[5200]: I1126 15:00:55.736638 5200 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" podStartSLOduration=3.736616107 podStartE2EDuration="3.736616107s" podCreationTimestamp="2025-11-26 15:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:55.733026352 +0000 UTC m=+5424.412228170" watchObservedRunningTime="2025-11-26 15:00:55.736616107 +0000 UTC m=+5424.415817925" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.132277 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29402821-wtjlz"] Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.143046 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.202257 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29402821-wtjlz"] Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.286885 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-config-data\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.287001 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9npg\" (UniqueName: \"kubernetes.io/projected/23c934e2-7b5a-4567-aac8-b8ffea246b3d-kube-api-access-z9npg\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.287036 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-fernet-keys\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.287123 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-combined-ca-bundle\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.389095 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-combined-ca-bundle\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.389177 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-config-data\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.389289 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-z9npg\" (UniqueName: \"kubernetes.io/projected/23c934e2-7b5a-4567-aac8-b8ffea246b3d-kube-api-access-z9npg\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.389324 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-fernet-keys\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.407975 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-fernet-keys\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.408256 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-config-data\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.409733 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-combined-ca-bundle\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.411456 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9npg\" (UniqueName: \"kubernetes.io/projected/23c934e2-7b5a-4567-aac8-b8ffea246b3d-kube-api-access-z9npg\") pod \"keystone-cron-29402821-wtjlz\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.481870 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:00 crc kubenswrapper[5200]: I1126 15:01:00.979248 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29402821-wtjlz"] Nov 26 15:01:01 crc kubenswrapper[5200]: I1126 15:01:01.725978 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:01:01 crc kubenswrapper[5200]: I1126 15:01:01.787436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402821-wtjlz" event={"ID":"23c934e2-7b5a-4567-aac8-b8ffea246b3d","Type":"ContainerStarted","Data":"6ade38ec45169e88cab8343560e164383782d471569d7815e3b23c0b3b0f1799"} Nov 26 15:01:01 crc kubenswrapper[5200]: I1126 15:01:01.787486 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402821-wtjlz" event={"ID":"23c934e2-7b5a-4567-aac8-b8ffea246b3d","Type":"ContainerStarted","Data":"ac34439a2906cc9dc5a0f2e06c9e23050ac87fe926f4385b13f93ced51970116"} Nov 26 15:01:01 crc kubenswrapper[5200]: I1126 15:01:01.822676 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54ccd7b865-7m2zq"] Nov 26 15:01:01 crc kubenswrapper[5200]: I1126 15:01:01.823083 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="dnsmasq-dns" containerID="cri-o://75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af" gracePeriod=10 Nov 26 15:01:01 crc kubenswrapper[5200]: I1126 15:01:01.836832 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29402821-wtjlz" podStartSLOduration=1.836810486 podStartE2EDuration="1.836810486s" podCreationTimestamp="2025-11-26 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:01.810188073 +0000 UTC m=+5430.489389891" watchObservedRunningTime="2025-11-26 15:01:01.836810486 +0000 UTC m=+5430.516012294" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.324020 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.444169 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-dns-svc\") pod \"f28ec540-6c3a-456f-89ed-98f5b56910c9\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.444536 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-config\") pod \"f28ec540-6c3a-456f-89ed-98f5b56910c9\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.444596 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcz2q\" (UniqueName: \"kubernetes.io/projected/f28ec540-6c3a-456f-89ed-98f5b56910c9-kube-api-access-lcz2q\") pod \"f28ec540-6c3a-456f-89ed-98f5b56910c9\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.444652 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-sb\") pod \"f28ec540-6c3a-456f-89ed-98f5b56910c9\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.444791 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-nb\") pod \"f28ec540-6c3a-456f-89ed-98f5b56910c9\" (UID: \"f28ec540-6c3a-456f-89ed-98f5b56910c9\") " Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.455266 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28ec540-6c3a-456f-89ed-98f5b56910c9-kube-api-access-lcz2q" (OuterVolumeSpecName: "kube-api-access-lcz2q") pod "f28ec540-6c3a-456f-89ed-98f5b56910c9" (UID: "f28ec540-6c3a-456f-89ed-98f5b56910c9"). InnerVolumeSpecName "kube-api-access-lcz2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.494585 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f28ec540-6c3a-456f-89ed-98f5b56910c9" (UID: "f28ec540-6c3a-456f-89ed-98f5b56910c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.496444 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-config" (OuterVolumeSpecName: "config") pod "f28ec540-6c3a-456f-89ed-98f5b56910c9" (UID: "f28ec540-6c3a-456f-89ed-98f5b56910c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.503844 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f28ec540-6c3a-456f-89ed-98f5b56910c9" (UID: "f28ec540-6c3a-456f-89ed-98f5b56910c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.506103 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f28ec540-6c3a-456f-89ed-98f5b56910c9" (UID: "f28ec540-6c3a-456f-89ed-98f5b56910c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.547748 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.547799 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.547810 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lcz2q\" (UniqueName: \"kubernetes.io/projected/f28ec540-6c3a-456f-89ed-98f5b56910c9-kube-api-access-lcz2q\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.547825 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.547834 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f28ec540-6c3a-456f-89ed-98f5b56910c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.802608 5200 generic.go:358] "Generic (PLEG): container finished" podID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerID="75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af" exitCode=0 Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.802740 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.802789 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" event={"ID":"f28ec540-6c3a-456f-89ed-98f5b56910c9","Type":"ContainerDied","Data":"75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af"} Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.802903 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" event={"ID":"f28ec540-6c3a-456f-89ed-98f5b56910c9","Type":"ContainerDied","Data":"eba8ab6edf68347d082f36d61970e1b8c4af5a193a5febb61450662d368e1807"} Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.802955 5200 scope.go:117] "RemoveContainer" containerID="75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.847971 5200 scope.go:117] "RemoveContainer" containerID="04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.857437 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54ccd7b865-7m2zq"] Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.876009 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54ccd7b865-7m2zq"] Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.897511 5200 scope.go:117] "RemoveContainer" containerID="75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af" Nov 26 15:01:02 crc kubenswrapper[5200]: E1126 15:01:02.898113 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af\": container with ID starting with 75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af not found: ID does not exist" containerID="75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.898188 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af"} err="failed to get container status \"75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af\": rpc error: code = NotFound desc = could not find container \"75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af\": container with ID starting with 75cf4c87bc02d95411fa0538785f9178e518c63f713945e3bc17e1894f8f27af not found: ID does not exist" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.898222 5200 scope.go:117] "RemoveContainer" containerID="04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34" Nov 26 15:01:02 crc kubenswrapper[5200]: E1126 15:01:02.899105 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34\": container with ID starting with 04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34 not found: ID does not exist" containerID="04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.899204 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34"} err="failed to get container status 
\"04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34\": rpc error: code = NotFound desc = could not find container \"04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34\": container with ID starting with 04f60f41286487c7425e891f7cbbb637b669d1b884fed3a1c48c5cd44d353d34 not found: ID does not exist" Nov 26 15:01:02 crc kubenswrapper[5200]: I1126 15:01:02.990620 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" path="/var/lib/kubelet/pods/f28ec540-6c3a-456f-89ed-98f5b56910c9/volumes" Nov 26 15:01:03 crc kubenswrapper[5200]: I1126 15:01:03.823785 5200 generic.go:358] "Generic (PLEG): container finished" podID="23c934e2-7b5a-4567-aac8-b8ffea246b3d" containerID="6ade38ec45169e88cab8343560e164383782d471569d7815e3b23c0b3b0f1799" exitCode=0 Nov 26 15:01:03 crc kubenswrapper[5200]: I1126 15:01:03.823915 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402821-wtjlz" event={"ID":"23c934e2-7b5a-4567-aac8-b8ffea246b3d","Type":"ContainerDied","Data":"6ade38ec45169e88cab8343560e164383782d471569d7815e3b23c0b3b0f1799"} Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.256610 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.321734 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9npg\" (UniqueName: \"kubernetes.io/projected/23c934e2-7b5a-4567-aac8-b8ffea246b3d-kube-api-access-z9npg\") pod \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.321962 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-fernet-keys\") pod \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.322056 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-config-data\") pod \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.322322 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-combined-ca-bundle\") pod \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\" (UID: \"23c934e2-7b5a-4567-aac8-b8ffea246b3d\") " Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.329784 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "23c934e2-7b5a-4567-aac8-b8ffea246b3d" (UID: "23c934e2-7b5a-4567-aac8-b8ffea246b3d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.330490 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c934e2-7b5a-4567-aac8-b8ffea246b3d-kube-api-access-z9npg" (OuterVolumeSpecName: "kube-api-access-z9npg") pod "23c934e2-7b5a-4567-aac8-b8ffea246b3d" (UID: "23c934e2-7b5a-4567-aac8-b8ffea246b3d"). 
InnerVolumeSpecName "kube-api-access-z9npg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.354499 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c934e2-7b5a-4567-aac8-b8ffea246b3d" (UID: "23c934e2-7b5a-4567-aac8-b8ffea246b3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.391985 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-config-data" (OuterVolumeSpecName: "config-data") pod "23c934e2-7b5a-4567-aac8-b8ffea246b3d" (UID: "23c934e2-7b5a-4567-aac8-b8ffea246b3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.427999 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.428060 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z9npg\" (UniqueName: \"kubernetes.io/projected/23c934e2-7b5a-4567-aac8-b8ffea246b3d-kube-api-access-z9npg\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.428081 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.428098 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c934e2-7b5a-4567-aac8-b8ffea246b3d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.851533 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402821-wtjlz" event={"ID":"23c934e2-7b5a-4567-aac8-b8ffea246b3d","Type":"ContainerDied","Data":"ac34439a2906cc9dc5a0f2e06c9e23050ac87fe926f4385b13f93ced51970116"} Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.852031 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac34439a2906cc9dc5a0f2e06c9e23050ac87fe926f4385b13f93ced51970116" Nov 26 15:01:05 crc kubenswrapper[5200]: I1126 15:01:05.851576 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29402821-wtjlz" Nov 26 15:01:07 crc kubenswrapper[5200]: I1126 15:01:07.188593 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54ccd7b865-7m2zq" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.31:5353: i/o timeout" Nov 26 15:01:25 crc kubenswrapper[5200]: I1126 15:01:25.715233 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d9c6594ff-hl6n8" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.483266 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n5j7m"] Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.484931 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23c934e2-7b5a-4567-aac8-b8ffea246b3d" containerName="keystone-cron" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.484948 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c934e2-7b5a-4567-aac8-b8ffea246b3d" containerName="keystone-cron" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.485000 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="dnsmasq-dns" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.485006 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="dnsmasq-dns" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.485022 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="init" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.485027 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="init" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.485189 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f28ec540-6c3a-456f-89ed-98f5b56910c9" containerName="dnsmasq-dns" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.485211 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="23c934e2-7b5a-4567-aac8-b8ffea246b3d" containerName="keystone-cron" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.496203 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.498099 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n5j7m"] Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.600976 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed77c7b7-3c1a-494e-a213-d83443e9692c-operator-scripts\") pod \"glance-db-create-n5j7m\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.601146 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqx2v\" (UniqueName: \"kubernetes.io/projected/ed77c7b7-3c1a-494e-a213-d83443e9692c-kube-api-access-fqx2v\") pod \"glance-db-create-n5j7m\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.601269 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-1d2c-account-create-update-skhvb"] Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.607608 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.608941 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1d2c-account-create-update-skhvb"] Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.612714 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-db-secret\"" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.703176 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqx2v\" (UniqueName: \"kubernetes.io/projected/ed77c7b7-3c1a-494e-a213-d83443e9692c-kube-api-access-fqx2v\") pod \"glance-db-create-n5j7m\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.703234 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknxm\" (UniqueName: \"kubernetes.io/projected/94489885-78b5-4375-8728-0ea952a5ae0d-kube-api-access-sknxm\") pod \"glance-1d2c-account-create-update-skhvb\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.703306 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94489885-78b5-4375-8728-0ea952a5ae0d-operator-scripts\") pod \"glance-1d2c-account-create-update-skhvb\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.703339 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed77c7b7-3c1a-494e-a213-d83443e9692c-operator-scripts\") pod \"glance-db-create-n5j7m\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.704191 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed77c7b7-3c1a-494e-a213-d83443e9692c-operator-scripts\") pod \"glance-db-create-n5j7m\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.736026 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqx2v\" (UniqueName: \"kubernetes.io/projected/ed77c7b7-3c1a-494e-a213-d83443e9692c-kube-api-access-fqx2v\") pod \"glance-db-create-n5j7m\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.805504 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sknxm\" (UniqueName: \"kubernetes.io/projected/94489885-78b5-4375-8728-0ea952a5ae0d-kube-api-access-sknxm\") pod \"glance-1d2c-account-create-update-skhvb\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.806098 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94489885-78b5-4375-8728-0ea952a5ae0d-operator-scripts\") pod \"glance-1d2c-account-create-update-skhvb\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.807105 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94489885-78b5-4375-8728-0ea952a5ae0d-operator-scripts\") pod \"glance-1d2c-account-create-update-skhvb\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.828157 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.829635 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknxm\" (UniqueName: \"kubernetes.io/projected/94489885-78b5-4375-8728-0ea952a5ae0d-kube-api-access-sknxm\") pod \"glance-1d2c-account-create-update-skhvb\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:32 crc kubenswrapper[5200]: I1126 15:01:32.943916 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:33 crc kubenswrapper[5200]: I1126 15:01:33.318323 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n5j7m"] Nov 26 15:01:33 crc kubenswrapper[5200]: W1126 15:01:33.322746 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded77c7b7_3c1a_494e_a213_d83443e9692c.slice/crio-c5214eb57dcaa642f8b1b13daf172c195a13f90840c809a49b5ea495c76c1794 WatchSource:0}: Error finding container c5214eb57dcaa642f8b1b13daf172c195a13f90840c809a49b5ea495c76c1794: Status 404 returned error can't find the container with id c5214eb57dcaa642f8b1b13daf172c195a13f90840c809a49b5ea495c76c1794 Nov 26 15:01:33 crc kubenswrapper[5200]: I1126 15:01:33.456156 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1d2c-account-create-update-skhvb"] Nov 26 15:01:34 crc kubenswrapper[5200]: I1126 15:01:34.210931 5200 generic.go:358] "Generic (PLEG): container finished" podID="94489885-78b5-4375-8728-0ea952a5ae0d" containerID="8c982c4bd2ad1f0fb0adb81722788c3ebbefc29eaa35db0d7852776203276a8d" exitCode=0 Nov 26 15:01:34 crc kubenswrapper[5200]: I1126 15:01:34.211173 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d2c-account-create-update-skhvb" event={"ID":"94489885-78b5-4375-8728-0ea952a5ae0d","Type":"ContainerDied","Data":"8c982c4bd2ad1f0fb0adb81722788c3ebbefc29eaa35db0d7852776203276a8d"} Nov 26 15:01:34 crc kubenswrapper[5200]: I1126 15:01:34.211548 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d2c-account-create-update-skhvb" event={"ID":"94489885-78b5-4375-8728-0ea952a5ae0d","Type":"ContainerStarted","Data":"f9e74753734e0bdf52b71d793cabfa67a70283c43d4a88a966d9c6b484a1dd93"} Nov 26 15:01:34 crc kubenswrapper[5200]: I1126 15:01:34.215580 5200 generic.go:358] "Generic (PLEG): container finished" podID="ed77c7b7-3c1a-494e-a213-d83443e9692c" containerID="1b1d687c2f10d9506213f440666689318dadba51c6186795df7628aec2c3bc42" exitCode=0 Nov 26 15:01:34 crc kubenswrapper[5200]: I1126 15:01:34.215722 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n5j7m" event={"ID":"ed77c7b7-3c1a-494e-a213-d83443e9692c","Type":"ContainerDied","Data":"1b1d687c2f10d9506213f440666689318dadba51c6186795df7628aec2c3bc42"} Nov 26 15:01:34 crc kubenswrapper[5200]: I1126 15:01:34.215775 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n5j7m" event={"ID":"ed77c7b7-3c1a-494e-a213-d83443e9692c","Type":"ContainerStarted","Data":"c5214eb57dcaa642f8b1b13daf172c195a13f90840c809a49b5ea495c76c1794"} Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.661680 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.667761 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.689843 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94489885-78b5-4375-8728-0ea952a5ae0d-operator-scripts\") pod \"94489885-78b5-4375-8728-0ea952a5ae0d\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.690098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed77c7b7-3c1a-494e-a213-d83443e9692c-operator-scripts\") pod \"ed77c7b7-3c1a-494e-a213-d83443e9692c\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.690530 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqx2v\" (UniqueName: \"kubernetes.io/projected/ed77c7b7-3c1a-494e-a213-d83443e9692c-kube-api-access-fqx2v\") pod \"ed77c7b7-3c1a-494e-a213-d83443e9692c\" (UID: \"ed77c7b7-3c1a-494e-a213-d83443e9692c\") " Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.690719 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sknxm\" (UniqueName: \"kubernetes.io/projected/94489885-78b5-4375-8728-0ea952a5ae0d-kube-api-access-sknxm\") pod \"94489885-78b5-4375-8728-0ea952a5ae0d\" (UID: \"94489885-78b5-4375-8728-0ea952a5ae0d\") " Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.696250 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed77c7b7-3c1a-494e-a213-d83443e9692c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed77c7b7-3c1a-494e-a213-d83443e9692c" (UID: "ed77c7b7-3c1a-494e-a213-d83443e9692c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.696281 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94489885-78b5-4375-8728-0ea952a5ae0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94489885-78b5-4375-8728-0ea952a5ae0d" (UID: "94489885-78b5-4375-8728-0ea952a5ae0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.706671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94489885-78b5-4375-8728-0ea952a5ae0d-kube-api-access-sknxm" (OuterVolumeSpecName: "kube-api-access-sknxm") pod "94489885-78b5-4375-8728-0ea952a5ae0d" (UID: "94489885-78b5-4375-8728-0ea952a5ae0d"). InnerVolumeSpecName "kube-api-access-sknxm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.706753 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed77c7b7-3c1a-494e-a213-d83443e9692c-kube-api-access-fqx2v" (OuterVolumeSpecName: "kube-api-access-fqx2v") pod "ed77c7b7-3c1a-494e-a213-d83443e9692c" (UID: "ed77c7b7-3c1a-494e-a213-d83443e9692c"). InnerVolumeSpecName "kube-api-access-fqx2v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.795744 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sknxm\" (UniqueName: \"kubernetes.io/projected/94489885-78b5-4375-8728-0ea952a5ae0d-kube-api-access-sknxm\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.795799 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94489885-78b5-4375-8728-0ea952a5ae0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.795818 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed77c7b7-3c1a-494e-a213-d83443e9692c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:35 crc kubenswrapper[5200]: I1126 15:01:35.795835 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqx2v\" (UniqueName: \"kubernetes.io/projected/ed77c7b7-3c1a-494e-a213-d83443e9692c-kube-api-access-fqx2v\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:36 crc kubenswrapper[5200]: I1126 15:01:36.240678 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d2c-account-create-update-skhvb" event={"ID":"94489885-78b5-4375-8728-0ea952a5ae0d","Type":"ContainerDied","Data":"f9e74753734e0bdf52b71d793cabfa67a70283c43d4a88a966d9c6b484a1dd93"} Nov 26 15:01:36 crc kubenswrapper[5200]: I1126 15:01:36.241035 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9e74753734e0bdf52b71d793cabfa67a70283c43d4a88a966d9c6b484a1dd93" Nov 26 15:01:36 crc kubenswrapper[5200]: I1126 15:01:36.240733 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d2c-account-create-update-skhvb" Nov 26 15:01:36 crc kubenswrapper[5200]: I1126 15:01:36.244099 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n5j7m" event={"ID":"ed77c7b7-3c1a-494e-a213-d83443e9692c","Type":"ContainerDied","Data":"c5214eb57dcaa642f8b1b13daf172c195a13f90840c809a49b5ea495c76c1794"} Nov 26 15:01:36 crc kubenswrapper[5200]: I1126 15:01:36.244146 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5214eb57dcaa642f8b1b13daf172c195a13f90840c809a49b5ea495c76c1794" Nov 26 15:01:36 crc kubenswrapper[5200]: I1126 15:01:36.244128 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n5j7m" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.819145 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-z6gzq"] Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.821874 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed77c7b7-3c1a-494e-a213-d83443e9692c" containerName="mariadb-database-create" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.821913 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed77c7b7-3c1a-494e-a213-d83443e9692c" containerName="mariadb-database-create" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.822002 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94489885-78b5-4375-8728-0ea952a5ae0d" containerName="mariadb-account-create-update" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.822016 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="94489885-78b5-4375-8728-0ea952a5ae0d" containerName="mariadb-account-create-update" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.822364 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="94489885-78b5-4375-8728-0ea952a5ae0d" containerName="mariadb-account-create-update" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.822408 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed77c7b7-3c1a-494e-a213-d83443e9692c" containerName="mariadb-database-create" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.833901 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z6gzq"] Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.834099 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.840472 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-config-data\"" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.841021 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-qwlg9\"" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.969925 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-config-data\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.970273 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-db-sync-config-data\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.970416 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwx5s\" (UniqueName: \"kubernetes.io/projected/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-kube-api-access-qwx5s\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:37 crc kubenswrapper[5200]: I1126 15:01:37.970488 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-combined-ca-bundle\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.073073 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-config-data\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.073165 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-db-sync-config-data\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.073251 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwx5s\" (UniqueName: \"kubernetes.io/projected/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-kube-api-access-qwx5s\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.073304 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-combined-ca-bundle\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc 
kubenswrapper[5200]: I1126 15:01:38.096004 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-config-data\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.096177 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-combined-ca-bundle\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.096765 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-db-sync-config-data\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.098615 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwx5s\" (UniqueName: \"kubernetes.io/projected/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-kube-api-access-qwx5s\") pod \"glance-db-sync-z6gzq\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.213388 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:38 crc kubenswrapper[5200]: I1126 15:01:38.591029 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z6gzq"] Nov 26 15:01:39 crc kubenswrapper[5200]: I1126 15:01:39.299400 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z6gzq" event={"ID":"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88","Type":"ContainerStarted","Data":"0608a7e591d23a06090c96b5040766ee633627779c36512f36903539a6ea1296"} Nov 26 15:01:40 crc kubenswrapper[5200]: I1126 15:01:40.313092 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z6gzq" event={"ID":"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88","Type":"ContainerStarted","Data":"039d36707774de098551c0fac8fb6c7ebfc05f6cfa89e3498b9025cfb5e15c1a"} Nov 26 15:01:40 crc kubenswrapper[5200]: I1126 15:01:40.345058 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-z6gzq" podStartSLOduration=3.3450317099999998 podStartE2EDuration="3.34503171s" podCreationTimestamp="2025-11-26 15:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:40.343760356 +0000 UTC m=+5469.022962214" watchObservedRunningTime="2025-11-26 15:01:40.34503171 +0000 UTC m=+5469.024233538" Nov 26 15:01:42 crc kubenswrapper[5200]: E1126 15:01:42.088996 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1585002 actualBytes=10240 Nov 26 15:01:43 crc kubenswrapper[5200]: I1126 15:01:43.353034 5200 generic.go:358] "Generic (PLEG): container finished" podID="80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" containerID="039d36707774de098551c0fac8fb6c7ebfc05f6cfa89e3498b9025cfb5e15c1a" exitCode=0 Nov 26 15:01:43 crc kubenswrapper[5200]: I1126 15:01:43.353170 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-z6gzq" event={"ID":"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88","Type":"ContainerDied","Data":"039d36707774de098551c0fac8fb6c7ebfc05f6cfa89e3498b9025cfb5e15c1a"} Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.813245 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.835439 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-combined-ca-bundle\") pod \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.835514 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-config-data\") pod \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.835580 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-db-sync-config-data\") pod \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.835619 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwx5s\" (UniqueName: \"kubernetes.io/projected/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-kube-api-access-qwx5s\") pod \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\" (UID: \"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88\") " Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.845651 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" (UID: "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.845646 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-kube-api-access-qwx5s" (OuterVolumeSpecName: "kube-api-access-qwx5s") pod "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" (UID: "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88"). InnerVolumeSpecName "kube-api-access-qwx5s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.868952 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" (UID: "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.891479 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-config-data" (OuterVolumeSpecName: "config-data") pod "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" (UID: "80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.938048 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.938371 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.938447 5200 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:44 crc kubenswrapper[5200]: I1126 15:01:44.938507 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwx5s\" (UniqueName: \"kubernetes.io/projected/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88-kube-api-access-qwx5s\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.380104 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z6gzq" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.380103 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z6gzq" event={"ID":"80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88","Type":"ContainerDied","Data":"0608a7e591d23a06090c96b5040766ee633627779c36512f36903539a6ea1296"} Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.380248 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0608a7e591d23a06090c96b5040766ee633627779c36512f36903539a6ea1296" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.735725 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.739165 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" containerName="glance-db-sync" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.739287 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" containerName="glance-db-sync" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.739912 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" containerName="glance-db-sync" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.753488 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.761062 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceph-conf-files\"" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.761068 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\"" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.785219 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.787387 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-qwlg9\"" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.793177 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.855773 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6454d795d5-jcpws"] Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.863104 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b2kz\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-kube-api-access-5b2kz\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.863169 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.863200 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.863247 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.863321 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-ceph\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.863365 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-logs\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc 
kubenswrapper[5200]: I1126 15:01:45.863428 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.877927 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6454d795d5-jcpws"] Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.878155 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965620 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965725 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-config\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965768 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-ceph\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965799 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qs94\" (UniqueName: \"kubernetes.io/projected/f1dc7223-d5b9-4e51-a05e-b5a86affce90-kube-api-access-4qs94\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965850 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-logs\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965929 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965958 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-dns-svc\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.965983 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-nb\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.966203 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-sb\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.966235 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b2kz\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-kube-api-access-5b2kz\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.966257 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.966305 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.966916 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.972099 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-logs\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.982041 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.982749 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.985310 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-ceph\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:45 crc kubenswrapper[5200]: I1126 15:01:45.993806 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b2kz\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-kube-api-access-5b2kz\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.002962 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.011656 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.067893 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-dns-svc\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.068496 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-nb\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.068682 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-sb\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.068996 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-config\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.069136 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qs94\" (UniqueName: \"kubernetes.io/projected/f1dc7223-d5b9-4e51-a05e-b5a86affce90-kube-api-access-4qs94\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.069459 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-nb\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.070180 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-sb\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.070321 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-config\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.070970 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-dns-svc\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.089197 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qs94\" (UniqueName: \"kubernetes.io/projected/f1dc7223-d5b9-4e51-a05e-b5a86affce90-kube-api-access-4qs94\") pod \"dnsmasq-dns-6454d795d5-jcpws\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.100868 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.120768 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.120946 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.123185 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.171348 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.171722 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.171866 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.171976 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2q2\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-kube-api-access-zm2q2\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.172075 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.172158 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.172291 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.211036 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274412 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274482 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2q2\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-kube-api-access-zm2q2\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274511 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274534 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274623 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.274695 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.275620 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.276566 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.280646 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.282217 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.282660 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.290838 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.293796 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2q2\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-kube-api-access-zm2q2\") pod \"glance-default-internal-api-0\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.513932 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.519672 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:46 crc kubenswrapper[5200]: W1126 15:01:46.540153 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f0f878_df1d_4f0e_be73_516675b2d3ac.slice/crio-33091930443abe41859d2788f98b1907ebc6ef52cdd3275deeda4cc3d462bdd6 WatchSource:0}: Error finding container 33091930443abe41859d2788f98b1907ebc6ef52cdd3275deeda4cc3d462bdd6: Status 404 returned error can't find the container with id 33091930443abe41859d2788f98b1907ebc6ef52cdd3275deeda4cc3d462bdd6 Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.784274 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6454d795d5-jcpws"] Nov 26 15:01:46 crc kubenswrapper[5200]: I1126 15:01:46.866816 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.127966 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:47 crc kubenswrapper[5200]: W1126 15:01:47.139058 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75eabd9d_8af5_4923_9f7b_1782dfb686e6.slice/crio-7b10bd889cc973cdad41713468fc4f9e03995e6ab37de4499dd5b23cd2888178 WatchSource:0}: Error finding container 7b10bd889cc973cdad41713468fc4f9e03995e6ab37de4499dd5b23cd2888178: Status 404 returned error can't find the container with id 7b10bd889cc973cdad41713468fc4f9e03995e6ab37de4499dd5b23cd2888178 Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.415167 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37f0f878-df1d-4f0e-be73-516675b2d3ac","Type":"ContainerStarted","Data":"6075b08fbd21839b843d7d68d1b8756c6fd3c0667848854ebae0d50349ed9fdd"} Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.415793 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37f0f878-df1d-4f0e-be73-516675b2d3ac","Type":"ContainerStarted","Data":"33091930443abe41859d2788f98b1907ebc6ef52cdd3275deeda4cc3d462bdd6"} Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.418233 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75eabd9d-8af5-4923-9f7b-1782dfb686e6","Type":"ContainerStarted","Data":"7b10bd889cc973cdad41713468fc4f9e03995e6ab37de4499dd5b23cd2888178"} Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.419585 5200 generic.go:358] "Generic (PLEG): container finished" podID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerID="257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74" exitCode=0 Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.419981 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" event={"ID":"f1dc7223-d5b9-4e51-a05e-b5a86affce90","Type":"ContainerDied","Data":"257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74"} Nov 26 15:01:47 crc kubenswrapper[5200]: I1126 15:01:47.420035 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" 
event={"ID":"f1dc7223-d5b9-4e51-a05e-b5a86affce90","Type":"ContainerStarted","Data":"12f536236c606ebef0cc00422b2d3f8f7a3a7d2191166d574bf6a3c628a6d7db"} Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.439429 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37f0f878-df1d-4f0e-be73-516675b2d3ac","Type":"ContainerStarted","Data":"7f39c454ac7e798c226e8e6f42eda2a9f0f7831bf59407a3071188f6f1ec3d4f"} Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.439528 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-log" containerID="cri-o://6075b08fbd21839b843d7d68d1b8756c6fd3c0667848854ebae0d50349ed9fdd" gracePeriod=30 Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.439655 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-httpd" containerID="cri-o://7f39c454ac7e798c226e8e6f42eda2a9f0f7831bf59407a3071188f6f1ec3d4f" gracePeriod=30 Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.450574 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75eabd9d-8af5-4923-9f7b-1782dfb686e6","Type":"ContainerStarted","Data":"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5"} Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.459409 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" event={"ID":"f1dc7223-d5b9-4e51-a05e-b5a86affce90","Type":"ContainerStarted","Data":"5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57"} Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.461629 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.486538 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.486506694 podStartE2EDuration="3.486506694s" podCreationTimestamp="2025-11-26 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:48.467855162 +0000 UTC m=+5477.147056980" watchObservedRunningTime="2025-11-26 15:01:48.486506694 +0000 UTC m=+5477.165708512" Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.503131 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" podStartSLOduration=3.503106862 podStartE2EDuration="3.503106862s" podCreationTimestamp="2025-11-26 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:48.490029327 +0000 UTC m=+5477.169231155" watchObservedRunningTime="2025-11-26 15:01:48.503106862 +0000 UTC m=+5477.182308680" Nov 26 15:01:48 crc kubenswrapper[5200]: I1126 15:01:48.966280 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.471068 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"75eabd9d-8af5-4923-9f7b-1782dfb686e6","Type":"ContainerStarted","Data":"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7"} Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.473638 5200 generic.go:358] "Generic (PLEG): container finished" podID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerID="7f39c454ac7e798c226e8e6f42eda2a9f0f7831bf59407a3071188f6f1ec3d4f" exitCode=0 Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.473668 5200 generic.go:358] "Generic (PLEG): container finished" podID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerID="6075b08fbd21839b843d7d68d1b8756c6fd3c0667848854ebae0d50349ed9fdd" exitCode=143 Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.474865 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37f0f878-df1d-4f0e-be73-516675b2d3ac","Type":"ContainerDied","Data":"7f39c454ac7e798c226e8e6f42eda2a9f0f7831bf59407a3071188f6f1ec3d4f"} Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.474899 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37f0f878-df1d-4f0e-be73-516675b2d3ac","Type":"ContainerDied","Data":"6075b08fbd21839b843d7d68d1b8756c6fd3c0667848854ebae0d50349ed9fdd"} Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.474914 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37f0f878-df1d-4f0e-be73-516675b2d3ac","Type":"ContainerDied","Data":"33091930443abe41859d2788f98b1907ebc6ef52cdd3275deeda4cc3d462bdd6"} Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.474926 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33091930443abe41859d2788f98b1907ebc6ef52cdd3275deeda4cc3d462bdd6" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.493815 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.493797124 podStartE2EDuration="4.493797124s" podCreationTimestamp="2025-11-26 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:49.492381607 +0000 UTC m=+5478.171583435" watchObservedRunningTime="2025-11-26 15:01:49.493797124 +0000 UTC m=+5478.172998942" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.505821 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.659589 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-scripts\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.659743 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-combined-ca-bundle\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.659820 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b2kz\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-kube-api-access-5b2kz\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.659881 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-httpd-run\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.659970 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-ceph\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.660050 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-config-data\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.660098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-logs\") pod \"37f0f878-df1d-4f0e-be73-516675b2d3ac\" (UID: \"37f0f878-df1d-4f0e-be73-516675b2d3ac\") " Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.661093 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-logs" (OuterVolumeSpecName: "logs") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.661150 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.666888 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-ceph" (OuterVolumeSpecName: "ceph") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.667559 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-kube-api-access-5b2kz" (OuterVolumeSpecName: "kube-api-access-5b2kz") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "kube-api-access-5b2kz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.668260 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-scripts" (OuterVolumeSpecName: "scripts") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.686916 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.737812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-config-data" (OuterVolumeSpecName: "config-data") pod "37f0f878-df1d-4f0e-be73-516675b2d3ac" (UID: "37f0f878-df1d-4f0e-be73-516675b2d3ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762783 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762826 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762905 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762920 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f0f878-df1d-4f0e-be73-516675b2d3ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762936 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5b2kz\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-kube-api-access-5b2kz\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762950 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37f0f878-df1d-4f0e-be73-516675b2d3ac-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:49 crc kubenswrapper[5200]: I1126 15:01:49.762961 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37f0f878-df1d-4f0e-be73-516675b2d3ac-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.490931 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-log" containerID="cri-o://acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5" gracePeriod=30 Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.491378 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-httpd" containerID="cri-o://2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7" gracePeriod=30 Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.491734 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.531662 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.538064 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.584168 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.586664 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-httpd" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.586769 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-httpd" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.586867 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-log" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.586880 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-log" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.587211 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-httpd" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.587257 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" containerName="glance-log" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.595931 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.599753 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.600972 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689065 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-logs\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689126 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-config-data\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689150 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-scripts\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689171 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689230 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689267 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-ceph\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.689315 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmj8\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-kube-api-access-7lmj8\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.790841 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-logs\") pod \"glance-default-external-api-0\" (UID: 
\"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.791024 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-config-data\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.791119 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-scripts\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.791213 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.791605 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.791816 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-ceph\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.792072 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.792222 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-logs\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.792831 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmj8\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-kube-api-access-7lmj8\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.796544 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-scripts\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 
15:01:50.796549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.798191 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-config-data\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.800906 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-ceph\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.811143 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmj8\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-kube-api-access-7lmj8\") pod \"glance-default-external-api-0\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " pod="openstack/glance-default-external-api-0" Nov 26 15:01:50 crc kubenswrapper[5200]: I1126 15:01:50.987368 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f0f878-df1d-4f0e-be73-516675b2d3ac" path="/var/lib/kubelet/pods/37f0f878-df1d-4f0e-be73-516675b2d3ac/volumes" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.065468 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.359867 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.504404 5200 generic.go:358] "Generic (PLEG): container finished" podID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerID="2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7" exitCode=0 Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.504944 5200 generic.go:358] "Generic (PLEG): container finished" podID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerID="acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5" exitCode=143 Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.504511 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75eabd9d-8af5-4923-9f7b-1782dfb686e6","Type":"ContainerDied","Data":"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7"} Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.505054 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75eabd9d-8af5-4923-9f7b-1782dfb686e6","Type":"ContainerDied","Data":"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5"} Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.505076 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"75eabd9d-8af5-4923-9f7b-1782dfb686e6","Type":"ContainerDied","Data":"7b10bd889cc973cdad41713468fc4f9e03995e6ab37de4499dd5b23cd2888178"} Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.504594 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.505095 5200 scope.go:117] "RemoveContainer" containerID="2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.514625 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-ceph\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515007 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-scripts\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515132 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-logs\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515585 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-combined-ca-bundle\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515638 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-logs" (OuterVolumeSpecName: "logs") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: 
"75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515657 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-httpd-run\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515858 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-config-data\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.515973 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: "75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.516275 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2q2\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-kube-api-access-zm2q2\") pod \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\" (UID: \"75eabd9d-8af5-4923-9f7b-1782dfb686e6\") " Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.518209 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.518228 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75eabd9d-8af5-4923-9f7b-1782dfb686e6-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.521488 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-kube-api-access-zm2q2" (OuterVolumeSpecName: "kube-api-access-zm2q2") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: "75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "kube-api-access-zm2q2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.521624 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-ceph" (OuterVolumeSpecName: "ceph") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: "75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.522121 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-scripts" (OuterVolumeSpecName: "scripts") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: "75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.539191 5200 scope.go:117] "RemoveContainer" containerID="acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.568746 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: "75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.576293 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-config-data" (OuterVolumeSpecName: "config-data") pod "75eabd9d-8af5-4923-9f7b-1782dfb686e6" (UID: "75eabd9d-8af5-4923-9f7b-1782dfb686e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.583274 5200 scope.go:117] "RemoveContainer" containerID="2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7" Nov 26 15:01:51 crc kubenswrapper[5200]: E1126 15:01:51.587499 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7\": container with ID starting with 2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7 not found: ID does not exist" containerID="2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.587548 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7"} err="failed to get container status \"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7\": rpc error: code = NotFound desc = could not find container \"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7\": container with ID starting with 2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7 not found: ID does not exist" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.587619 5200 scope.go:117] "RemoveContainer" containerID="acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5" Nov 26 15:01:51 crc kubenswrapper[5200]: E1126 15:01:51.588358 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5\": container with ID starting with acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5 not found: ID does not exist" containerID="acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.588407 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5"} err="failed to get container status \"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5\": rpc error: code = NotFound desc = could not find container \"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5\": container with ID starting with acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5 not 
found: ID does not exist" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.588436 5200 scope.go:117] "RemoveContainer" containerID="2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.588760 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7"} err="failed to get container status \"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7\": rpc error: code = NotFound desc = could not find container \"2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7\": container with ID starting with 2743bac27999d5992e2d72ac164a774f83d3538e00a6689ad538f0129868dca7 not found: ID does not exist" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.588783 5200 scope.go:117] "RemoveContainer" containerID="acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.589265 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5"} err="failed to get container status \"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5\": rpc error: code = NotFound desc = could not find container \"acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5\": container with ID starting with acc35a91652dbfd83c2d2760ca400207d0da1ec660035169e10ceaa1284288a5 not found: ID does not exist" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.620665 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.620745 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zm2q2\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-kube-api-access-zm2q2\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.620758 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75eabd9d-8af5-4923-9f7b-1782dfb686e6-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.620768 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.620776 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75eabd9d-8af5-4923-9f7b-1782dfb686e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.675642 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.869903 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.894596 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.916635 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 
15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.920128 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-log" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.920157 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-log" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.920193 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-httpd" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.920199 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-httpd" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.920459 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-httpd" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.920484 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" containerName="glance-log" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.930486 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.930657 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:01:51 crc kubenswrapper[5200]: I1126 15:01:51.933122 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028357 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028784 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028814 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028870 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028897 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028912 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.028963 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87v8r\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-kube-api-access-87v8r\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.130828 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.130880 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.131016 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87v8r\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-kube-api-access-87v8r\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.131071 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.131275 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.131325 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.131444 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.133697 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.134604 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.138650 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.139208 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.142897 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.143775 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.153319 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87v8r\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-kube-api-access-87v8r\") pod \"glance-default-internal-api-0\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.254055 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.519929 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3726df62-5cbd-44e0-af53-6097e8a99089","Type":"ContainerStarted","Data":"445e7507e24f17f34ab09694d26415b94e9f7462eb438ca397d5eb12945390b2"} Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.520316 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3726df62-5cbd-44e0-af53-6097e8a99089","Type":"ContainerStarted","Data":"bbdf79a96d18c1a1a0cc50d41d2748a803f9ad64dfbbf23fcc76707fe533e781"} Nov 26 15:01:52 crc kubenswrapper[5200]: I1126 15:01:52.798230 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:01:52 crc kubenswrapper[5200]: W1126 15:01:52.807575 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a9ab2ad_8c80_4821_8223_346343d1c4a3.slice/crio-f26e2a8d8c379cd862663920769bf2c2efd554b6e0fa634a59688bf738e16629 WatchSource:0}: Error finding container f26e2a8d8c379cd862663920769bf2c2efd554b6e0fa634a59688bf738e16629: Status 404 returned error can't find the container with id f26e2a8d8c379cd862663920769bf2c2efd554b6e0fa634a59688bf738e16629 Nov 26 15:01:53 crc kubenswrapper[5200]: I1126 15:01:53.016490 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75eabd9d-8af5-4923-9f7b-1782dfb686e6" path="/var/lib/kubelet/pods/75eabd9d-8af5-4923-9f7b-1782dfb686e6/volumes" Nov 26 15:01:53 crc kubenswrapper[5200]: I1126 15:01:53.540127 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a9ab2ad-8c80-4821-8223-346343d1c4a3","Type":"ContainerStarted","Data":"754d6612e9d989ffc241aa4b07505878a90e8f5e11bd3b22482da80f4bff2ca9"} Nov 26 15:01:53 crc kubenswrapper[5200]: I1126 15:01:53.540694 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a9ab2ad-8c80-4821-8223-346343d1c4a3","Type":"ContainerStarted","Data":"f26e2a8d8c379cd862663920769bf2c2efd554b6e0fa634a59688bf738e16629"} Nov 26 15:01:53 crc kubenswrapper[5200]: I1126 15:01:53.545411 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3726df62-5cbd-44e0-af53-6097e8a99089","Type":"ContainerStarted","Data":"e9c3c7d7e22346be3b1b68caacb77d4a03db0afbbbec4b9af4454e54d843d71a"} Nov 26 15:01:53 crc kubenswrapper[5200]: I1126 15:01:53.576520 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.5764991569999998 podStartE2EDuration="3.576499157s" podCreationTimestamp="2025-11-26 15:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:53.575158451 +0000 UTC m=+5482.254360289" watchObservedRunningTime="2025-11-26 15:01:53.576499157 +0000 UTC m=+5482.255700985" Nov 26 15:01:54 crc kubenswrapper[5200]: I1126 15:01:54.559065 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a9ab2ad-8c80-4821-8223-346343d1c4a3","Type":"ContainerStarted","Data":"dd9951a3e60274cc10eac62488feca37bbf6967de71549cba94f5e87bc4c60d0"} Nov 26 15:01:54 crc kubenswrapper[5200]: I1126 
15:01:54.593625 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5936033849999998 podStartE2EDuration="3.593603385s" podCreationTimestamp="2025-11-26 15:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:54.580479279 +0000 UTC m=+5483.259681137" watchObservedRunningTime="2025-11-26 15:01:54.593603385 +0000 UTC m=+5483.272805213" Nov 26 15:01:55 crc kubenswrapper[5200]: I1126 15:01:55.492919 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:01:55 crc kubenswrapper[5200]: I1126 15:01:55.580481 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdcdf5569-szn2x"] Nov 26 15:01:55 crc kubenswrapper[5200]: I1126 15:01:55.580856 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerName="dnsmasq-dns" containerID="cri-o://fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9" gracePeriod=10 Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.077078 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.231097 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-nb\") pod \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.231215 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjpp\" (UniqueName: \"kubernetes.io/projected/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-kube-api-access-lxjpp\") pod \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.231475 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-sb\") pod \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.231701 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-dns-svc\") pod \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.231741 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-config\") pod \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\" (UID: \"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f\") " Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.237794 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-kube-api-access-lxjpp" (OuterVolumeSpecName: "kube-api-access-lxjpp") pod "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" (UID: "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f"). 
InnerVolumeSpecName "kube-api-access-lxjpp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.277115 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" (UID: "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.279111 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" (UID: "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.292298 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-config" (OuterVolumeSpecName: "config") pod "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" (UID: "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.292548 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" (UID: "7d2c07fe-11cb-4d6b-97d2-776cccb00e8f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.334622 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.334879 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.335008 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.335161 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxjpp\" (UniqueName: \"kubernetes.io/projected/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-kube-api-access-lxjpp\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.335307 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.607229 5200 generic.go:358] "Generic (PLEG): container finished" podID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerID="fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9" exitCode=0 Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.607277 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" event={"ID":"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f","Type":"ContainerDied","Data":"fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9"} Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.607321 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" event={"ID":"7d2c07fe-11cb-4d6b-97d2-776cccb00e8f","Type":"ContainerDied","Data":"21ecc4c7cb58a99564532bae6e97da3fa50d1dd50025edfca2d538a2f55deb8f"} Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.607343 5200 scope.go:117] "RemoveContainer" containerID="fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.607364 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdcdf5569-szn2x" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.649197 5200 scope.go:117] "RemoveContainer" containerID="09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.658689 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdcdf5569-szn2x"] Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.666495 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdcdf5569-szn2x"] Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.683097 5200 scope.go:117] "RemoveContainer" containerID="fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9" Nov 26 15:01:56 crc kubenswrapper[5200]: E1126 15:01:56.683762 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9\": container with ID starting with fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9 not found: ID does not exist" containerID="fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.683808 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9"} err="failed to get container status \"fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9\": rpc error: code = NotFound desc = could not find container \"fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9\": container with ID starting with fe78d4debe86a3ea77b6a7a49830154a73930f7fd6b9ac1ef1a5c18d29d023b9 not found: ID does not exist" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.683838 5200 scope.go:117] "RemoveContainer" containerID="09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55" Nov 26 15:01:56 crc kubenswrapper[5200]: E1126 15:01:56.684216 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55\": container with ID starting with 09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55 not found: ID does not exist" containerID="09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55" Nov 26 15:01:56 crc kubenswrapper[5200]: I1126 15:01:56.684259 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55"} err="failed to get container status 
\"09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55\": rpc error: code = NotFound desc = could not find container \"09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55\": container with ID starting with 09bf8933acaf8752d0bc1decd7ea9eb6fe22ab314fff694e99db1464efd19e55 not found: ID does not exist" Nov 26 15:01:57 crc kubenswrapper[5200]: I1126 15:01:57.007244 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" path="/var/lib/kubelet/pods/7d2c07fe-11cb-4d6b-97d2-776cccb00e8f/volumes" Nov 26 15:02:01 crc kubenswrapper[5200]: I1126 15:02:01.066068 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:02:01 crc kubenswrapper[5200]: I1126 15:02:01.066372 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:02:01 crc kubenswrapper[5200]: I1126 15:02:01.116672 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:02:01 crc kubenswrapper[5200]: I1126 15:02:01.140645 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:02:01 crc kubenswrapper[5200]: I1126 15:02:01.674388 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 15:02:01 crc kubenswrapper[5200]: I1126 15:02:01.674479 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 15:02:02 crc kubenswrapper[5200]: I1126 15:02:02.254875 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:02 crc kubenswrapper[5200]: I1126 15:02:02.255486 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:02 crc kubenswrapper[5200]: I1126 15:02:02.316463 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:02 crc kubenswrapper[5200]: I1126 15:02:02.343549 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:02 crc kubenswrapper[5200]: I1126 15:02:02.688835 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:02 crc kubenswrapper[5200]: I1126 15:02:02.689623 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:03 crc kubenswrapper[5200]: I1126 15:02:03.664444 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:02:03 crc kubenswrapper[5200]: I1126 15:02:03.669150 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:02:04 crc kubenswrapper[5200]: I1126 15:02:04.714311 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:02:04 crc kubenswrapper[5200]: I1126 15:02:04.714350 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:02:04 crc kubenswrapper[5200]: I1126 15:02:04.773937 5200 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:04 crc kubenswrapper[5200]: I1126 15:02:04.776788 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.798313 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8c2rg"] Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.799883 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerName="dnsmasq-dns" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.799899 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerName="dnsmasq-dns" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.799955 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerName="init" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.799961 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerName="init" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.800122 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d2c07fe-11cb-4d6b-97d2-776cccb00e8f" containerName="dnsmasq-dns" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.818944 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8c2rg"] Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.819086 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.900209 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-0fd7-account-create-update-skfbq"] Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.907422 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.910100 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-db-secret\"" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.910284 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0fd7-account-create-update-skfbq"] Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.976659 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f1d368-42ab-4fcf-9938-638ee6cc0734-operator-scripts\") pod \"placement-0fd7-account-create-update-skfbq\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.977081 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhk92\" (UniqueName: \"kubernetes.io/projected/19f1d368-42ab-4fcf-9938-638ee6cc0734-kube-api-access-rhk92\") pod \"placement-0fd7-account-create-update-skfbq\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.977202 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-operator-scripts\") pod \"placement-db-create-8c2rg\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:12 crc kubenswrapper[5200]: I1126 15:02:12.977304 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmbpj\" (UniqueName: \"kubernetes.io/projected/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-kube-api-access-gmbpj\") pod \"placement-db-create-8c2rg\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.079444 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-operator-scripts\") pod \"placement-db-create-8c2rg\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.079807 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmbpj\" (UniqueName: \"kubernetes.io/projected/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-kube-api-access-gmbpj\") pod \"placement-db-create-8c2rg\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.080186 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f1d368-42ab-4fcf-9938-638ee6cc0734-operator-scripts\") pod \"placement-0fd7-account-create-update-skfbq\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.080400 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhk92\" (UniqueName: 
\"kubernetes.io/projected/19f1d368-42ab-4fcf-9938-638ee6cc0734-kube-api-access-rhk92\") pod \"placement-0fd7-account-create-update-skfbq\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.081074 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f1d368-42ab-4fcf-9938-638ee6cc0734-operator-scripts\") pod \"placement-0fd7-account-create-update-skfbq\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.081089 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-operator-scripts\") pod \"placement-db-create-8c2rg\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.100607 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhk92\" (UniqueName: \"kubernetes.io/projected/19f1d368-42ab-4fcf-9938-638ee6cc0734-kube-api-access-rhk92\") pod \"placement-0fd7-account-create-update-skfbq\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.102090 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmbpj\" (UniqueName: \"kubernetes.io/projected/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-kube-api-access-gmbpj\") pod \"placement-db-create-8c2rg\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.138235 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.229675 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.734350 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8c2rg"] Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.757712 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0fd7-account-create-update-skfbq"] Nov 26 15:02:13 crc kubenswrapper[5200]: W1126 15:02:13.758802 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19f1d368_42ab_4fcf_9938_638ee6cc0734.slice/crio-d7753333d4efef8dc5efeb8fa91008f953a2899a48fe87a36d200e079f143d63 WatchSource:0}: Error finding container d7753333d4efef8dc5efeb8fa91008f953a2899a48fe87a36d200e079f143d63: Status 404 returned error can't find the container with id d7753333d4efef8dc5efeb8fa91008f953a2899a48fe87a36d200e079f143d63 Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.843481 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8c2rg" event={"ID":"7f6b73eb-d75d-4660-b0bd-3095b7197ec7","Type":"ContainerStarted","Data":"437da5230f19fd8849a73d63b47db1aa54e1f31e65ca1f95bbddb63fd6251637"} Nov 26 15:02:13 crc kubenswrapper[5200]: I1126 15:02:13.845715 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0fd7-account-create-update-skfbq" event={"ID":"19f1d368-42ab-4fcf-9938-638ee6cc0734","Type":"ContainerStarted","Data":"d7753333d4efef8dc5efeb8fa91008f953a2899a48fe87a36d200e079f143d63"} Nov 26 15:02:14 crc kubenswrapper[5200]: I1126 15:02:14.861539 5200 generic.go:358] "Generic (PLEG): container finished" podID="19f1d368-42ab-4fcf-9938-638ee6cc0734" containerID="ef195788cc4a2fe50a4020cb9e87cdcd0018720ba15f052926059a68ac16e368" exitCode=0 Nov 26 15:02:14 crc kubenswrapper[5200]: I1126 15:02:14.861637 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0fd7-account-create-update-skfbq" event={"ID":"19f1d368-42ab-4fcf-9938-638ee6cc0734","Type":"ContainerDied","Data":"ef195788cc4a2fe50a4020cb9e87cdcd0018720ba15f052926059a68ac16e368"} Nov 26 15:02:14 crc kubenswrapper[5200]: I1126 15:02:14.865552 5200 generic.go:358] "Generic (PLEG): container finished" podID="7f6b73eb-d75d-4660-b0bd-3095b7197ec7" containerID="2bcce92ad40ea4f23f9aa07247cf89ee67ea1900992828080e5efef46e2926f8" exitCode=0 Nov 26 15:02:14 crc kubenswrapper[5200]: I1126 15:02:14.865772 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8c2rg" event={"ID":"7f6b73eb-d75d-4660-b0bd-3095b7197ec7","Type":"ContainerDied","Data":"2bcce92ad40ea4f23f9aa07247cf89ee67ea1900992828080e5efef46e2926f8"} Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.326914 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.333061 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.387580 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-operator-scripts\") pod \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.387733 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhk92\" (UniqueName: \"kubernetes.io/projected/19f1d368-42ab-4fcf-9938-638ee6cc0734-kube-api-access-rhk92\") pod \"19f1d368-42ab-4fcf-9938-638ee6cc0734\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.387797 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmbpj\" (UniqueName: \"kubernetes.io/projected/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-kube-api-access-gmbpj\") pod \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\" (UID: \"7f6b73eb-d75d-4660-b0bd-3095b7197ec7\") " Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.387823 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f1d368-42ab-4fcf-9938-638ee6cc0734-operator-scripts\") pod \"19f1d368-42ab-4fcf-9938-638ee6cc0734\" (UID: \"19f1d368-42ab-4fcf-9938-638ee6cc0734\") " Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.388457 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f1d368-42ab-4fcf-9938-638ee6cc0734-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19f1d368-42ab-4fcf-9938-638ee6cc0734" (UID: "19f1d368-42ab-4fcf-9938-638ee6cc0734"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.388537 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f6b73eb-d75d-4660-b0bd-3095b7197ec7" (UID: "7f6b73eb-d75d-4660-b0bd-3095b7197ec7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.388902 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f1d368-42ab-4fcf-9938-638ee6cc0734-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.389410 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.395025 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-kube-api-access-gmbpj" (OuterVolumeSpecName: "kube-api-access-gmbpj") pod "7f6b73eb-d75d-4660-b0bd-3095b7197ec7" (UID: "7f6b73eb-d75d-4660-b0bd-3095b7197ec7"). InnerVolumeSpecName "kube-api-access-gmbpj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.397802 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f1d368-42ab-4fcf-9938-638ee6cc0734-kube-api-access-rhk92" (OuterVolumeSpecName: "kube-api-access-rhk92") pod "19f1d368-42ab-4fcf-9938-638ee6cc0734" (UID: "19f1d368-42ab-4fcf-9938-638ee6cc0734"). InnerVolumeSpecName "kube-api-access-rhk92". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.491258 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhk92\" (UniqueName: \"kubernetes.io/projected/19f1d368-42ab-4fcf-9938-638ee6cc0734-kube-api-access-rhk92\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.491297 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmbpj\" (UniqueName: \"kubernetes.io/projected/7f6b73eb-d75d-4660-b0bd-3095b7197ec7-kube-api-access-gmbpj\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.892559 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8c2rg" event={"ID":"7f6b73eb-d75d-4660-b0bd-3095b7197ec7","Type":"ContainerDied","Data":"437da5230f19fd8849a73d63b47db1aa54e1f31e65ca1f95bbddb63fd6251637"} Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.892607 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437da5230f19fd8849a73d63b47db1aa54e1f31e65ca1f95bbddb63fd6251637" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.892678 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8c2rg" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.899819 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0fd7-account-create-update-skfbq" Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.899895 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0fd7-account-create-update-skfbq" event={"ID":"19f1d368-42ab-4fcf-9938-638ee6cc0734","Type":"ContainerDied","Data":"d7753333d4efef8dc5efeb8fa91008f953a2899a48fe87a36d200e079f143d63"} Nov 26 15:02:16 crc kubenswrapper[5200]: I1126 15:02:16.899953 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7753333d4efef8dc5efeb8fa91008f953a2899a48fe87a36d200e079f143d63" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.251125 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d6d878d7c-542xd"] Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.252614 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f6b73eb-d75d-4660-b0bd-3095b7197ec7" containerName="mariadb-database-create" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.252634 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6b73eb-d75d-4660-b0bd-3095b7197ec7" containerName="mariadb-database-create" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.252733 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f1d368-42ab-4fcf-9938-638ee6cc0734" containerName="mariadb-account-create-update" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.252741 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f1d368-42ab-4fcf-9938-638ee6cc0734" containerName="mariadb-account-create-update" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.252898 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="19f1d368-42ab-4fcf-9938-638ee6cc0734" containerName="mariadb-account-create-update" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.252909 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f6b73eb-d75d-4660-b0bd-3095b7197ec7" containerName="mariadb-database-create" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.265416 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.265276 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d6d878d7c-542xd"] Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.282792 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zs9hq"] Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.299213 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.302005 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-scripts\"" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.302224 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-placement-dockercfg-qcf7q\"" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.303701 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-config-data\"" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.307131 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zs9hq"] Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.330875 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx99l\" (UniqueName: \"kubernetes.io/projected/047cdc76-603e-4cd3-a38c-10d473ec14f3-kube-api-access-bx99l\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.330920 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.330977 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047cdc76-603e-4cd3-a38c-10d473ec14f3-logs\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331157 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331216 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-combined-ca-bundle\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331250 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsc8s\" (UniqueName: \"kubernetes.io/projected/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-kube-api-access-tsc8s\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331328 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-scripts\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " 
pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331403 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-config\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331472 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-config-data\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.331495 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-dns-svc\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434085 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-combined-ca-bundle\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434232 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsc8s\" (UniqueName: \"kubernetes.io/projected/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-kube-api-access-tsc8s\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434372 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-scripts\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434446 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-config\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434534 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-config-data\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434573 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-dns-svc\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434677 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx99l\" (UniqueName: \"kubernetes.io/projected/047cdc76-603e-4cd3-a38c-10d473ec14f3-kube-api-access-bx99l\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434725 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434868 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047cdc76-603e-4cd3-a38c-10d473ec14f3-logs\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.434981 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.435446 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-config\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.435675 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-dns-svc\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.435868 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.436067 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047cdc76-603e-4cd3-a38c-10d473ec14f3-logs\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.436085 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.445566 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-config-data\") pod 
\"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.446885 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-scripts\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.457182 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-combined-ca-bundle\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.457622 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx99l\" (UniqueName: \"kubernetes.io/projected/047cdc76-603e-4cd3-a38c-10d473ec14f3-kube-api-access-bx99l\") pod \"placement-db-sync-zs9hq\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.458100 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsc8s\" (UniqueName: \"kubernetes.io/projected/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-kube-api-access-tsc8s\") pod \"dnsmasq-dns-5d6d878d7c-542xd\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.602193 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:18 crc kubenswrapper[5200]: I1126 15:02:18.628758 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.063791 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d6d878d7c-542xd"] Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.139021 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zs9hq"] Nov 26 15:02:19 crc kubenswrapper[5200]: W1126 15:02:19.140874 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod047cdc76_603e_4cd3_a38c_10d473ec14f3.slice/crio-0d36ca8a1024df07e79beda7bc6d0ab619c0f9490f8332247044b150b6640b29 WatchSource:0}: Error finding container 0d36ca8a1024df07e79beda7bc6d0ab619c0f9490f8332247044b150b6640b29: Status 404 returned error can't find the container with id 0d36ca8a1024df07e79beda7bc6d0ab619c0f9490f8332247044b150b6640b29 Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.946884 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zs9hq" event={"ID":"047cdc76-603e-4cd3-a38c-10d473ec14f3","Type":"ContainerStarted","Data":"9af90665d53563ad73d20ae88b657ea66b87675b9ec77088a8a9225b3ee37357"} Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.947280 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zs9hq" event={"ID":"047cdc76-603e-4cd3-a38c-10d473ec14f3","Type":"ContainerStarted","Data":"0d36ca8a1024df07e79beda7bc6d0ab619c0f9490f8332247044b150b6640b29"} Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.950213 5200 generic.go:358] "Generic (PLEG): container finished" podID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerID="30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937" exitCode=0 Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.950415 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" event={"ID":"b9ca1ad7-10a2-42a3-8f79-e36941e707d6","Type":"ContainerDied","Data":"30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937"} Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.950513 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" event={"ID":"b9ca1ad7-10a2-42a3-8f79-e36941e707d6","Type":"ContainerStarted","Data":"92372470445ddcf2c0bb9eaf02d0eac0a456115ad2586601d69a437f805cdb26"} Nov 26 15:02:19 crc kubenswrapper[5200]: I1126 15:02:19.974353 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zs9hq" podStartSLOduration=1.97432304 podStartE2EDuration="1.97432304s" podCreationTimestamp="2025-11-26 15:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:02:19.968106676 +0000 UTC m=+5508.647308494" watchObservedRunningTime="2025-11-26 15:02:19.97432304 +0000 UTC m=+5508.653524858" Nov 26 15:02:20 crc kubenswrapper[5200]: I1126 15:02:20.962630 5200 generic.go:358] "Generic (PLEG): container finished" podID="047cdc76-603e-4cd3-a38c-10d473ec14f3" containerID="9af90665d53563ad73d20ae88b657ea66b87675b9ec77088a8a9225b3ee37357" exitCode=0 Nov 26 15:02:20 crc kubenswrapper[5200]: I1126 15:02:20.962753 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zs9hq" 
event={"ID":"047cdc76-603e-4cd3-a38c-10d473ec14f3","Type":"ContainerDied","Data":"9af90665d53563ad73d20ae88b657ea66b87675b9ec77088a8a9225b3ee37357"} Nov 26 15:02:20 crc kubenswrapper[5200]: I1126 15:02:20.966578 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" event={"ID":"b9ca1ad7-10a2-42a3-8f79-e36941e707d6","Type":"ContainerStarted","Data":"150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61"} Nov 26 15:02:20 crc kubenswrapper[5200]: I1126 15:02:20.966711 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:21 crc kubenswrapper[5200]: I1126 15:02:21.027148 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" podStartSLOduration=3.027122151 podStartE2EDuration="3.027122151s" podCreationTimestamp="2025-11-26 15:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:02:21.022784597 +0000 UTC m=+5509.701986455" watchObservedRunningTime="2025-11-26 15:02:21.027122151 +0000 UTC m=+5509.706323989" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.347244 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.418365 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-config-data\") pod \"047cdc76-603e-4cd3-a38c-10d473ec14f3\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.418543 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx99l\" (UniqueName: \"kubernetes.io/projected/047cdc76-603e-4cd3-a38c-10d473ec14f3-kube-api-access-bx99l\") pod \"047cdc76-603e-4cd3-a38c-10d473ec14f3\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.418579 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047cdc76-603e-4cd3-a38c-10d473ec14f3-logs\") pod \"047cdc76-603e-4cd3-a38c-10d473ec14f3\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.418623 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-combined-ca-bundle\") pod \"047cdc76-603e-4cd3-a38c-10d473ec14f3\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.418752 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-scripts\") pod \"047cdc76-603e-4cd3-a38c-10d473ec14f3\" (UID: \"047cdc76-603e-4cd3-a38c-10d473ec14f3\") " Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.419589 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047cdc76-603e-4cd3-a38c-10d473ec14f3-logs" (OuterVolumeSpecName: "logs") pod "047cdc76-603e-4cd3-a38c-10d473ec14f3" (UID: "047cdc76-603e-4cd3-a38c-10d473ec14f3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.426108 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047cdc76-603e-4cd3-a38c-10d473ec14f3-kube-api-access-bx99l" (OuterVolumeSpecName: "kube-api-access-bx99l") pod "047cdc76-603e-4cd3-a38c-10d473ec14f3" (UID: "047cdc76-603e-4cd3-a38c-10d473ec14f3"). InnerVolumeSpecName "kube-api-access-bx99l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.426734 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-scripts" (OuterVolumeSpecName: "scripts") pod "047cdc76-603e-4cd3-a38c-10d473ec14f3" (UID: "047cdc76-603e-4cd3-a38c-10d473ec14f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.443565 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-config-data" (OuterVolumeSpecName: "config-data") pod "047cdc76-603e-4cd3-a38c-10d473ec14f3" (UID: "047cdc76-603e-4cd3-a38c-10d473ec14f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.463547 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "047cdc76-603e-4cd3-a38c-10d473ec14f3" (UID: "047cdc76-603e-4cd3-a38c-10d473ec14f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.520565 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bx99l\" (UniqueName: \"kubernetes.io/projected/047cdc76-603e-4cd3-a38c-10d473ec14f3-kube-api-access-bx99l\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.520597 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047cdc76-603e-4cd3-a38c-10d473ec14f3-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.520609 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.520618 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.520627 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047cdc76-603e-4cd3-a38c-10d473ec14f3-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.985406 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zs9hq" event={"ID":"047cdc76-603e-4cd3-a38c-10d473ec14f3","Type":"ContainerDied","Data":"0d36ca8a1024df07e79beda7bc6d0ab619c0f9490f8332247044b150b6640b29"} Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.985669 5200 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="0d36ca8a1024df07e79beda7bc6d0ab619c0f9490f8332247044b150b6640b29" Nov 26 15:02:22 crc kubenswrapper[5200]: I1126 15:02:22.985448 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zs9hq" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.085238 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-dd54f8444-drbpt"] Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.086861 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="047cdc76-603e-4cd3-a38c-10d473ec14f3" containerName="placement-db-sync" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.086983 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="047cdc76-603e-4cd3-a38c-10d473ec14f3" containerName="placement-db-sync" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.087266 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="047cdc76-603e-4cd3-a38c-10d473ec14f3" containerName="placement-db-sync" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.097610 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.101994 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dd54f8444-drbpt"] Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.102648 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-scripts\"" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.103316 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-placement-dockercfg-qcf7q\"" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.104050 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-config-data\"" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.235153 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnc7\" (UniqueName: \"kubernetes.io/projected/04b48eb6-187a-4044-abb8-e64b68df8f17-kube-api-access-mwnc7\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.235351 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-combined-ca-bundle\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.235434 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-scripts\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.235479 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-config-data\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " 
pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.235509 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b48eb6-187a-4044-abb8-e64b68df8f17-logs\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.337467 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-combined-ca-bundle\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.337597 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-scripts\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.337656 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-config-data\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.337709 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b48eb6-187a-4044-abb8-e64b68df8f17-logs\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.337766 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnc7\" (UniqueName: \"kubernetes.io/projected/04b48eb6-187a-4044-abb8-e64b68df8f17-kube-api-access-mwnc7\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.338189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b48eb6-187a-4044-abb8-e64b68df8f17-logs\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.343571 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-config-data\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.345571 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-combined-ca-bundle\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.356082 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/04b48eb6-187a-4044-abb8-e64b68df8f17-scripts\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.357393 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnc7\" (UniqueName: \"kubernetes.io/projected/04b48eb6-187a-4044-abb8-e64b68df8f17-kube-api-access-mwnc7\") pod \"placement-dd54f8444-drbpt\" (UID: \"04b48eb6-187a-4044-abb8-e64b68df8f17\") " pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.413675 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:23 crc kubenswrapper[5200]: I1126 15:02:23.895861 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dd54f8444-drbpt"] Nov 26 15:02:24 crc kubenswrapper[5200]: I1126 15:02:24.005142 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd54f8444-drbpt" event={"ID":"04b48eb6-187a-4044-abb8-e64b68df8f17","Type":"ContainerStarted","Data":"94cb7961f6cb65f37680fae4ffbbca08b2d68a752e09e5062e420c8684128a83"} Nov 26 15:02:25 crc kubenswrapper[5200]: I1126 15:02:25.018513 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd54f8444-drbpt" event={"ID":"04b48eb6-187a-4044-abb8-e64b68df8f17","Type":"ContainerStarted","Data":"518f1b7bc4b7af148194ece28567cfb721fedb9aee60ccb81cc282afff7f73e1"} Nov 26 15:02:25 crc kubenswrapper[5200]: I1126 15:02:25.019104 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:25 crc kubenswrapper[5200]: I1126 15:02:25.019130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd54f8444-drbpt" event={"ID":"04b48eb6-187a-4044-abb8-e64b68df8f17","Type":"ContainerStarted","Data":"6ba1c15f99b2eaa3345cd972ce5fade264a052476e4bc371a4bb625b7553b4b0"} Nov 26 15:02:25 crc kubenswrapper[5200]: I1126 15:02:25.061358 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-dd54f8444-drbpt" podStartSLOduration=2.061329744 podStartE2EDuration="2.061329744s" podCreationTimestamp="2025-11-26 15:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:02:25.04942832 +0000 UTC m=+5513.728630218" watchObservedRunningTime="2025-11-26 15:02:25.061329744 +0000 UTC m=+5513.740531592" Nov 26 15:02:26 crc kubenswrapper[5200]: I1126 15:02:26.032549 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:26 crc kubenswrapper[5200]: I1126 15:02:26.984745 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.081393 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6454d795d5-jcpws"] Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.081824 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerName="dnsmasq-dns" containerID="cri-o://5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57" gracePeriod=10 Nov 26 15:02:27 crc 
kubenswrapper[5200]: I1126 15:02:27.610255 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.736384 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-dns-svc\") pod \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.736818 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-nb\") pod \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.736894 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-config\") pod \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.737044 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qs94\" (UniqueName: \"kubernetes.io/projected/f1dc7223-d5b9-4e51-a05e-b5a86affce90-kube-api-access-4qs94\") pod \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.737155 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-sb\") pod \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\" (UID: \"f1dc7223-d5b9-4e51-a05e-b5a86affce90\") " Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.744023 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1dc7223-d5b9-4e51-a05e-b5a86affce90-kube-api-access-4qs94" (OuterVolumeSpecName: "kube-api-access-4qs94") pod "f1dc7223-d5b9-4e51-a05e-b5a86affce90" (UID: "f1dc7223-d5b9-4e51-a05e-b5a86affce90"). InnerVolumeSpecName "kube-api-access-4qs94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.776038 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1dc7223-d5b9-4e51-a05e-b5a86affce90" (UID: "f1dc7223-d5b9-4e51-a05e-b5a86affce90"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.784602 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1dc7223-d5b9-4e51-a05e-b5a86affce90" (UID: "f1dc7223-d5b9-4e51-a05e-b5a86affce90"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.787659 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1dc7223-d5b9-4e51-a05e-b5a86affce90" (UID: "f1dc7223-d5b9-4e51-a05e-b5a86affce90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.791122 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-config" (OuterVolumeSpecName: "config") pod "f1dc7223-d5b9-4e51-a05e-b5a86affce90" (UID: "f1dc7223-d5b9-4e51-a05e-b5a86affce90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.840395 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.840446 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.840461 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qs94\" (UniqueName: \"kubernetes.io/projected/f1dc7223-d5b9-4e51-a05e-b5a86affce90-kube-api-access-4qs94\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.840476 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[5200]: I1126 15:02:27.840490 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1dc7223-d5b9-4e51-a05e-b5a86affce90-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.063142 5200 generic.go:358] "Generic (PLEG): container finished" podID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerID="5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57" exitCode=0 Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.063243 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.063255 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" event={"ID":"f1dc7223-d5b9-4e51-a05e-b5a86affce90","Type":"ContainerDied","Data":"5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57"} Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.063310 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d795d5-jcpws" event={"ID":"f1dc7223-d5b9-4e51-a05e-b5a86affce90","Type":"ContainerDied","Data":"12f536236c606ebef0cc00422b2d3f8f7a3a7d2191166d574bf6a3c628a6d7db"} Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.063334 5200 scope.go:117] "RemoveContainer" containerID="5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.105455 5200 scope.go:117] "RemoveContainer" containerID="257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.106902 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6454d795d5-jcpws"] Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.116391 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6454d795d5-jcpws"] Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.137287 5200 scope.go:117] "RemoveContainer" containerID="5738e47ee40ac9d3049af9fa752126b776493e6b2d135e85890cb16e0c01fb37" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.141881 5200 scope.go:117] "RemoveContainer" containerID="5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57" Nov 26 15:02:28 crc kubenswrapper[5200]: E1126 15:02:28.144393 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57\": container with ID starting with 5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57 not found: ID does not exist" containerID="5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.144436 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57"} err="failed to get container status \"5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57\": rpc error: code = NotFound desc = could not find container \"5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57\": container with ID starting with 5b3bbbac66cdc04571c7134d502b84a690df1f218e239eec013c2fc2af22bc57 not found: ID does not exist" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.144490 5200 scope.go:117] "RemoveContainer" containerID="257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74" Nov 26 15:02:28 crc kubenswrapper[5200]: E1126 15:02:28.145069 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74\": container with ID starting with 257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74 not found: ID does not exist" containerID="257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.145100 5200 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74"} err="failed to get container status \"257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74\": rpc error: code = NotFound desc = could not find container \"257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74\": container with ID starting with 257de15875729627f660c22f977a5d01c470cfbcb95fe013aba5e297d27e9e74 not found: ID does not exist" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.174533 5200 scope.go:117] "RemoveContainer" containerID="c723b510ab6f60d75d696fad653e16b16f43f948b48f8a41054f34bb53558583" Nov 26 15:02:28 crc kubenswrapper[5200]: I1126 15:02:28.989564 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" path="/var/lib/kubelet/pods/f1dc7223-d5b9-4e51-a05e-b5a86affce90/volumes" Nov 26 15:02:33 crc kubenswrapper[5200]: I1126 15:02:33.154286 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:02:33 crc kubenswrapper[5200]: I1126 15:02:33.154899 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:02:42 crc kubenswrapper[5200]: E1126 15:02:42.304150 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1600948 actualBytes=10240 Nov 26 15:02:57 crc kubenswrapper[5200]: I1126 15:02:57.057128 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:02:57 crc kubenswrapper[5200]: I1126 15:02:57.987404 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-dd54f8444-drbpt" Nov 26 15:03:01 crc kubenswrapper[5200]: E1126 15:03:01.331979 5200 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.224:52080->38.102.83.224:42551: write tcp 38.102.83.224:52080->38.102.83.224:42551: write: broken pipe Nov 26 15:03:03 crc kubenswrapper[5200]: I1126 15:03:03.154573 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:03:03 crc kubenswrapper[5200]: I1126 15:03:03.154709 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.670237 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xcw24"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.674297 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" 
containerName="init" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.674348 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerName="init" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.674958 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerName="dnsmasq-dns" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.674978 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerName="dnsmasq-dns" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.675221 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1dc7223-d5b9-4e51-a05e-b5a86affce90" containerName="dnsmasq-dns" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.692017 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xcw24"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.692486 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.760912 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-d5lnx"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.797326 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d5lnx"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.797477 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.837833 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75bjx\" (UniqueName: \"kubernetes.io/projected/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-kube-api-access-75bjx\") pod \"nova-api-db-create-xcw24\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.837991 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-operator-scripts\") pod \"nova-api-db-create-xcw24\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.867618 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-28zsn"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.892893 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-28zsn"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.892967 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-69e3-account-create-update-f7fsr"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.892969 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.910089 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-69e3-account-create-update-f7fsr"] Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.910231 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.913113 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-db-secret\"" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.940370 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pt6x\" (UniqueName: \"kubernetes.io/projected/710a41bd-779a-4224-82d6-03ba64697e32-kube-api-access-4pt6x\") pod \"nova-cell0-db-create-d5lnx\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.940893 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710a41bd-779a-4224-82d6-03ba64697e32-operator-scripts\") pod \"nova-cell0-db-create-d5lnx\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.942460 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75bjx\" (UniqueName: \"kubernetes.io/projected/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-kube-api-access-75bjx\") pod \"nova-api-db-create-xcw24\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.942569 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-operator-scripts\") pod \"nova-api-db-create-xcw24\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.945164 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-operator-scripts\") pod \"nova-api-db-create-xcw24\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:22 crc kubenswrapper[5200]: I1126 15:03:22.961403 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75bjx\" (UniqueName: \"kubernetes.io/projected/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-kube-api-access-75bjx\") pod \"nova-api-db-create-xcw24\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.023969 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.051327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27d0974-edc9-4344-85a9-52cb419cafaa-operator-scripts\") pod \"nova-api-69e3-account-create-update-f7fsr\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.051420 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-operator-scripts\") pod \"nova-cell1-db-create-28zsn\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.051452 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjr75\" (UniqueName: \"kubernetes.io/projected/e27d0974-edc9-4344-85a9-52cb419cafaa-kube-api-access-wjr75\") pod \"nova-api-69e3-account-create-update-f7fsr\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.051475 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6jqm\" (UniqueName: \"kubernetes.io/projected/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-kube-api-access-c6jqm\") pod \"nova-cell1-db-create-28zsn\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.051517 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pt6x\" (UniqueName: \"kubernetes.io/projected/710a41bd-779a-4224-82d6-03ba64697e32-kube-api-access-4pt6x\") pod \"nova-cell0-db-create-d5lnx\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.051570 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710a41bd-779a-4224-82d6-03ba64697e32-operator-scripts\") pod \"nova-cell0-db-create-d5lnx\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.052460 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710a41bd-779a-4224-82d6-03ba64697e32-operator-scripts\") pod \"nova-cell0-db-create-d5lnx\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.073984 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pt6x\" (UniqueName: \"kubernetes.io/projected/710a41bd-779a-4224-82d6-03ba64697e32-kube-api-access-4pt6x\") pod \"nova-cell0-db-create-d5lnx\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.089745 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-0e9e-account-create-update-plwlj"] Nov 26 15:03:23 crc kubenswrapper[5200]: 
I1126 15:03:23.144220 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.155997 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27d0974-edc9-4344-85a9-52cb419cafaa-operator-scripts\") pod \"nova-api-69e3-account-create-update-f7fsr\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.156078 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-operator-scripts\") pod \"nova-cell1-db-create-28zsn\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.156129 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjr75\" (UniqueName: \"kubernetes.io/projected/e27d0974-edc9-4344-85a9-52cb419cafaa-kube-api-access-wjr75\") pod \"nova-api-69e3-account-create-update-f7fsr\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.156167 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6jqm\" (UniqueName: \"kubernetes.io/projected/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-kube-api-access-c6jqm\") pod \"nova-cell1-db-create-28zsn\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.157643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27d0974-edc9-4344-85a9-52cb419cafaa-operator-scripts\") pod \"nova-api-69e3-account-create-update-f7fsr\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.158537 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-operator-scripts\") pod \"nova-cell1-db-create-28zsn\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.183958 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6jqm\" (UniqueName: \"kubernetes.io/projected/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-kube-api-access-c6jqm\") pod \"nova-cell1-db-create-28zsn\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.184328 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjr75\" (UniqueName: \"kubernetes.io/projected/e27d0974-edc9-4344-85a9-52cb419cafaa-kube-api-access-wjr75\") pod \"nova-api-69e3-account-create-update-f7fsr\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.209867 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.229084 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.243190 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0e9e-account-create-update-plwlj"] Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.243358 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.256899 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-db-secret\"" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.306771 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0192-account-create-update-5clsm"] Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.349572 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0192-account-create-update-5clsm"] Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.349770 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.352816 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-db-secret\"" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.359088 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94225dcd-0095-452e-a108-5e98828f20a4-operator-scripts\") pod \"nova-cell0-0e9e-account-create-update-plwlj\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.359155 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7sq\" (UniqueName: \"kubernetes.io/projected/94225dcd-0095-452e-a108-5e98828f20a4-kube-api-access-bp7sq\") pod \"nova-cell0-0e9e-account-create-update-plwlj\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.461730 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnbm\" (UniqueName: \"kubernetes.io/projected/f88c5fd6-6888-43d2-8772-2055f386b9e7-kube-api-access-vcnbm\") pod \"nova-cell1-0192-account-create-update-5clsm\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.462207 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88c5fd6-6888-43d2-8772-2055f386b9e7-operator-scripts\") pod \"nova-cell1-0192-account-create-update-5clsm\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.462379 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/94225dcd-0095-452e-a108-5e98828f20a4-operator-scripts\") pod \"nova-cell0-0e9e-account-create-update-plwlj\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.462466 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7sq\" (UniqueName: \"kubernetes.io/projected/94225dcd-0095-452e-a108-5e98828f20a4-kube-api-access-bp7sq\") pod \"nova-cell0-0e9e-account-create-update-plwlj\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.464532 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94225dcd-0095-452e-a108-5e98828f20a4-operator-scripts\") pod \"nova-cell0-0e9e-account-create-update-plwlj\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.484073 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7sq\" (UniqueName: \"kubernetes.io/projected/94225dcd-0095-452e-a108-5e98828f20a4-kube-api-access-bp7sq\") pod \"nova-cell0-0e9e-account-create-update-plwlj\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.564535 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnbm\" (UniqueName: \"kubernetes.io/projected/f88c5fd6-6888-43d2-8772-2055f386b9e7-kube-api-access-vcnbm\") pod \"nova-cell1-0192-account-create-update-5clsm\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.564620 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88c5fd6-6888-43d2-8772-2055f386b9e7-operator-scripts\") pod \"nova-cell1-0192-account-create-update-5clsm\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.565502 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88c5fd6-6888-43d2-8772-2055f386b9e7-operator-scripts\") pod \"nova-cell1-0192-account-create-update-5clsm\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.581538 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnbm\" (UniqueName: \"kubernetes.io/projected/f88c5fd6-6888-43d2-8772-2055f386b9e7-kube-api-access-vcnbm\") pod \"nova-cell1-0192-account-create-update-5clsm\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.600639 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.681739 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xcw24"] Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.692342 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.777765 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d5lnx"] Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.865107 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-28zsn"] Nov 26 15:03:23 crc kubenswrapper[5200]: I1126 15:03:23.890497 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-69e3-account-create-update-f7fsr"] Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.056514 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0e9e-account-create-update-plwlj"] Nov 26 15:03:24 crc kubenswrapper[5200]: W1126 15:03:24.077227 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94225dcd_0095_452e_a108_5e98828f20a4.slice/crio-924dfb975dad254a00923014e51a92869479d51ce098a720ce7b4cdb8821a6cd WatchSource:0}: Error finding container 924dfb975dad254a00923014e51a92869479d51ce098a720ce7b4cdb8821a6cd: Status 404 returned error can't find the container with id 924dfb975dad254a00923014e51a92869479d51ce098a720ce7b4cdb8821a6cd Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.315281 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0192-account-create-update-5clsm"] Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.665476 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0192-account-create-update-5clsm" event={"ID":"f88c5fd6-6888-43d2-8772-2055f386b9e7","Type":"ContainerStarted","Data":"01c81f23db63b51133d06f2f43f6c206e7b4808b77a6d396abe5640308e02982"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.667462 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d5lnx" event={"ID":"710a41bd-779a-4224-82d6-03ba64697e32","Type":"ContainerStarted","Data":"74661dd015674ffae44b2fe418694189e4802d55a0ea3924630f6d531a27914c"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.667507 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d5lnx" event={"ID":"710a41bd-779a-4224-82d6-03ba64697e32","Type":"ContainerStarted","Data":"cd412ca7dedfb135b41ce970db87ad17b0236489898ba9cef058c6b8d64c74f1"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.669633 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xcw24" event={"ID":"0595c8ee-0fcc-46d9-a97a-fff2b66a5284","Type":"ContainerStarted","Data":"81c0b0005c659bc955e129114ed7e262466dbd299273de69a8402c6c1df1204a"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.669795 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xcw24" event={"ID":"0595c8ee-0fcc-46d9-a97a-fff2b66a5284","Type":"ContainerStarted","Data":"1337fe4965467d9f0e69664a327138cb05780d2f5897beecb94047708829eff5"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.671251 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-69e3-account-create-update-f7fsr" event={"ID":"e27d0974-edc9-4344-85a9-52cb419cafaa","Type":"ContainerStarted","Data":"53e39ae1a985831ea97c52a05bf5ae272295110be45a57975961604f31fbfbdb"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.671277 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-69e3-account-create-update-f7fsr" event={"ID":"e27d0974-edc9-4344-85a9-52cb419cafaa","Type":"ContainerStarted","Data":"5dd1c41c72a229dd08249de99b60bcf7865d533c17289d64497071f29bd6a0ba"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.672784 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28zsn" event={"ID":"6c3d2845-4af4-4f14-ade5-afc0530f2ed1","Type":"ContainerStarted","Data":"5ee778c3eff81bdd1d8c96e4bad8259b8949eeedc0b3b26b9148196c3340228a"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.672819 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28zsn" event={"ID":"6c3d2845-4af4-4f14-ade5-afc0530f2ed1","Type":"ContainerStarted","Data":"f1ff417abf88472f30260bd50a7781f9e0df5bfb3ba67a00027bd8501eb57a4d"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.674410 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" event={"ID":"94225dcd-0095-452e-a108-5e98828f20a4","Type":"ContainerStarted","Data":"531d4cfc4a86580928b5de6f8775840ca6eeb81f9c7dfc187c6c5a2061edb467"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.674448 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" event={"ID":"94225dcd-0095-452e-a108-5e98828f20a4","Type":"ContainerStarted","Data":"924dfb975dad254a00923014e51a92869479d51ce098a720ce7b4cdb8821a6cd"} Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.689480 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-xcw24" podStartSLOduration=2.6894595900000002 podStartE2EDuration="2.68945959s" podCreationTimestamp="2025-11-26 15:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:24.682098156 +0000 UTC m=+5573.361299984" watchObservedRunningTime="2025-11-26 15:03:24.68945959 +0000 UTC m=+5573.368661398" Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.703450 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-69e3-account-create-update-f7fsr" podStartSLOduration=2.703432789 podStartE2EDuration="2.703432789s" podCreationTimestamp="2025-11-26 15:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:24.696880426 +0000 UTC m=+5573.376082254" watchObservedRunningTime="2025-11-26 15:03:24.703432789 +0000 UTC m=+5573.382634607" Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.716350 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" podStartSLOduration=1.7163334689999998 podStartE2EDuration="1.716333469s" podCreationTimestamp="2025-11-26 15:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:24.711836951 +0000 UTC m=+5573.391038789" watchObservedRunningTime="2025-11-26 15:03:24.716333469 +0000 UTC 
m=+5573.395535287" Nov 26 15:03:24 crc kubenswrapper[5200]: I1126 15:03:24.734769 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-28zsn" podStartSLOduration=2.734744755 podStartE2EDuration="2.734744755s" podCreationTimestamp="2025-11-26 15:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:24.72694621 +0000 UTC m=+5573.406148048" watchObservedRunningTime="2025-11-26 15:03:24.734744755 +0000 UTC m=+5573.413946573" Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.687262 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0192-account-create-update-5clsm" event={"ID":"f88c5fd6-6888-43d2-8772-2055f386b9e7","Type":"ContainerStarted","Data":"c976b0181a762b4bfac461ad1ebb6762b393ac061e8d00f04d9ae54d918d5d61"} Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.691605 5200 generic.go:358] "Generic (PLEG): container finished" podID="710a41bd-779a-4224-82d6-03ba64697e32" containerID="74661dd015674ffae44b2fe418694189e4802d55a0ea3924630f6d531a27914c" exitCode=0 Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.691769 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d5lnx" event={"ID":"710a41bd-779a-4224-82d6-03ba64697e32","Type":"ContainerDied","Data":"74661dd015674ffae44b2fe418694189e4802d55a0ea3924630f6d531a27914c"} Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.695513 5200 generic.go:358] "Generic (PLEG): container finished" podID="0595c8ee-0fcc-46d9-a97a-fff2b66a5284" containerID="81c0b0005c659bc955e129114ed7e262466dbd299273de69a8402c6c1df1204a" exitCode=0 Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.695765 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xcw24" event={"ID":"0595c8ee-0fcc-46d9-a97a-fff2b66a5284","Type":"ContainerDied","Data":"81c0b0005c659bc955e129114ed7e262466dbd299273de69a8402c6c1df1204a"} Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.710388 5200 generic.go:358] "Generic (PLEG): container finished" podID="6c3d2845-4af4-4f14-ade5-afc0530f2ed1" containerID="5ee778c3eff81bdd1d8c96e4bad8259b8949eeedc0b3b26b9148196c3340228a" exitCode=0 Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.710455 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28zsn" event={"ID":"6c3d2845-4af4-4f14-ade5-afc0530f2ed1","Type":"ContainerDied","Data":"5ee778c3eff81bdd1d8c96e4bad8259b8949eeedc0b3b26b9148196c3340228a"} Nov 26 15:03:25 crc kubenswrapper[5200]: I1126 15:03:25.716965 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-0192-account-create-update-5clsm" podStartSLOduration=2.716948043 podStartE2EDuration="2.716948043s" podCreationTimestamp="2025-11-26 15:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:25.71039481 +0000 UTC m=+5574.389596648" watchObservedRunningTime="2025-11-26 15:03:25.716948043 +0000 UTC m=+5574.396149861" Nov 26 15:03:26 crc kubenswrapper[5200]: I1126 15:03:26.728935 5200 generic.go:358] "Generic (PLEG): container finished" podID="e27d0974-edc9-4344-85a9-52cb419cafaa" containerID="53e39ae1a985831ea97c52a05bf5ae272295110be45a57975961604f31fbfbdb" exitCode=0 Nov 26 15:03:26 crc kubenswrapper[5200]: I1126 15:03:26.729080 5200 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-69e3-account-create-update-f7fsr" event={"ID":"e27d0974-edc9-4344-85a9-52cb419cafaa","Type":"ContainerDied","Data":"53e39ae1a985831ea97c52a05bf5ae272295110be45a57975961604f31fbfbdb"} Nov 26 15:03:26 crc kubenswrapper[5200]: I1126 15:03:26.732121 5200 generic.go:358] "Generic (PLEG): container finished" podID="94225dcd-0095-452e-a108-5e98828f20a4" containerID="531d4cfc4a86580928b5de6f8775840ca6eeb81f9c7dfc187c6c5a2061edb467" exitCode=0 Nov 26 15:03:26 crc kubenswrapper[5200]: I1126 15:03:26.732336 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" event={"ID":"94225dcd-0095-452e-a108-5e98828f20a4","Type":"ContainerDied","Data":"531d4cfc4a86580928b5de6f8775840ca6eeb81f9c7dfc187c6c5a2061edb467"} Nov 26 15:03:26 crc kubenswrapper[5200]: I1126 15:03:26.736901 5200 generic.go:358] "Generic (PLEG): container finished" podID="f88c5fd6-6888-43d2-8772-2055f386b9e7" containerID="c976b0181a762b4bfac461ad1ebb6762b393ac061e8d00f04d9ae54d918d5d61" exitCode=0 Nov 26 15:03:26 crc kubenswrapper[5200]: I1126 15:03:26.737267 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0192-account-create-update-5clsm" event={"ID":"f88c5fd6-6888-43d2-8772-2055f386b9e7","Type":"ContainerDied","Data":"c976b0181a762b4bfac461ad1ebb6762b393ac061e8d00f04d9ae54d918d5d61"} Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.240778 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.247477 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.251442 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6jqm\" (UniqueName: \"kubernetes.io/projected/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-kube-api-access-c6jqm\") pod \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.251561 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-operator-scripts\") pod \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\" (UID: \"6c3d2845-4af4-4f14-ade5-afc0530f2ed1\") " Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.252591 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c3d2845-4af4-4f14-ade5-afc0530f2ed1" (UID: "6c3d2845-4af4-4f14-ade5-afc0530f2ed1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.258778 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.264271 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-kube-api-access-c6jqm" (OuterVolumeSpecName: "kube-api-access-c6jqm") pod "6c3d2845-4af4-4f14-ade5-afc0530f2ed1" (UID: "6c3d2845-4af4-4f14-ade5-afc0530f2ed1"). InnerVolumeSpecName "kube-api-access-c6jqm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.353147 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-operator-scripts\") pod \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.353679 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710a41bd-779a-4224-82d6-03ba64697e32-operator-scripts\") pod \"710a41bd-779a-4224-82d6-03ba64697e32\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.353872 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75bjx\" (UniqueName: \"kubernetes.io/projected/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-kube-api-access-75bjx\") pod \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\" (UID: \"0595c8ee-0fcc-46d9-a97a-fff2b66a5284\") " Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.354173 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pt6x\" (UniqueName: \"kubernetes.io/projected/710a41bd-779a-4224-82d6-03ba64697e32-kube-api-access-4pt6x\") pod \"710a41bd-779a-4224-82d6-03ba64697e32\" (UID: \"710a41bd-779a-4224-82d6-03ba64697e32\") " Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.354346 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710a41bd-779a-4224-82d6-03ba64697e32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "710a41bd-779a-4224-82d6-03ba64697e32" (UID: "710a41bd-779a-4224-82d6-03ba64697e32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.354673 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0595c8ee-0fcc-46d9-a97a-fff2b66a5284" (UID: "0595c8ee-0fcc-46d9-a97a-fff2b66a5284"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.354924 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.355000 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710a41bd-779a-4224-82d6-03ba64697e32-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.355058 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6jqm\" (UniqueName: \"kubernetes.io/projected/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-kube-api-access-c6jqm\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.355114 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3d2845-4af4-4f14-ade5-afc0530f2ed1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.357492 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-kube-api-access-75bjx" (OuterVolumeSpecName: "kube-api-access-75bjx") pod "0595c8ee-0fcc-46d9-a97a-fff2b66a5284" (UID: "0595c8ee-0fcc-46d9-a97a-fff2b66a5284"). InnerVolumeSpecName "kube-api-access-75bjx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.357611 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710a41bd-779a-4224-82d6-03ba64697e32-kube-api-access-4pt6x" (OuterVolumeSpecName: "kube-api-access-4pt6x") pod "710a41bd-779a-4224-82d6-03ba64697e32" (UID: "710a41bd-779a-4224-82d6-03ba64697e32"). InnerVolumeSpecName "kube-api-access-4pt6x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.456834 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75bjx\" (UniqueName: \"kubernetes.io/projected/0595c8ee-0fcc-46d9-a97a-fff2b66a5284-kube-api-access-75bjx\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.456873 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4pt6x\" (UniqueName: \"kubernetes.io/projected/710a41bd-779a-4224-82d6-03ba64697e32-kube-api-access-4pt6x\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.757677 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28zsn" event={"ID":"6c3d2845-4af4-4f14-ade5-afc0530f2ed1","Type":"ContainerDied","Data":"f1ff417abf88472f30260bd50a7781f9e0df5bfb3ba67a00027bd8501eb57a4d"} Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.759858 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1ff417abf88472f30260bd50a7781f9e0df5bfb3ba67a00027bd8501eb57a4d" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.759932 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-28zsn" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.763247 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d5lnx" event={"ID":"710a41bd-779a-4224-82d6-03ba64697e32","Type":"ContainerDied","Data":"cd412ca7dedfb135b41ce970db87ad17b0236489898ba9cef058c6b8d64c74f1"} Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.763379 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd412ca7dedfb135b41ce970db87ad17b0236489898ba9cef058c6b8d64c74f1" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.763388 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d5lnx" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.767671 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xcw24" Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.767750 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xcw24" event={"ID":"0595c8ee-0fcc-46d9-a97a-fff2b66a5284","Type":"ContainerDied","Data":"1337fe4965467d9f0e69664a327138cb05780d2f5897beecb94047708829eff5"} Nov 26 15:03:27 crc kubenswrapper[5200]: I1126 15:03:27.767791 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1337fe4965467d9f0e69664a327138cb05780d2f5897beecb94047708829eff5" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.068046 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.172731 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjr75\" (UniqueName: \"kubernetes.io/projected/e27d0974-edc9-4344-85a9-52cb419cafaa-kube-api-access-wjr75\") pod \"e27d0974-edc9-4344-85a9-52cb419cafaa\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.172971 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27d0974-edc9-4344-85a9-52cb419cafaa-operator-scripts\") pod \"e27d0974-edc9-4344-85a9-52cb419cafaa\" (UID: \"e27d0974-edc9-4344-85a9-52cb419cafaa\") " Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.173759 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27d0974-edc9-4344-85a9-52cb419cafaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e27d0974-edc9-4344-85a9-52cb419cafaa" (UID: "e27d0974-edc9-4344-85a9-52cb419cafaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.178284 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27d0974-edc9-4344-85a9-52cb419cafaa-kube-api-access-wjr75" (OuterVolumeSpecName: "kube-api-access-wjr75") pod "e27d0974-edc9-4344-85a9-52cb419cafaa" (UID: "e27d0974-edc9-4344-85a9-52cb419cafaa"). InnerVolumeSpecName "kube-api-access-wjr75". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.275436 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wjr75\" (UniqueName: \"kubernetes.io/projected/e27d0974-edc9-4344-85a9-52cb419cafaa-kube-api-access-wjr75\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.275471 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27d0974-edc9-4344-85a9-52cb419cafaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.301057 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.308534 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.376590 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcnbm\" (UniqueName: \"kubernetes.io/projected/f88c5fd6-6888-43d2-8772-2055f386b9e7-kube-api-access-vcnbm\") pod \"f88c5fd6-6888-43d2-8772-2055f386b9e7\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.376642 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88c5fd6-6888-43d2-8772-2055f386b9e7-operator-scripts\") pod \"f88c5fd6-6888-43d2-8772-2055f386b9e7\" (UID: \"f88c5fd6-6888-43d2-8772-2055f386b9e7\") " Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.376796 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp7sq\" (UniqueName: \"kubernetes.io/projected/94225dcd-0095-452e-a108-5e98828f20a4-kube-api-access-bp7sq\") pod \"94225dcd-0095-452e-a108-5e98828f20a4\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.376924 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94225dcd-0095-452e-a108-5e98828f20a4-operator-scripts\") pod \"94225dcd-0095-452e-a108-5e98828f20a4\" (UID: \"94225dcd-0095-452e-a108-5e98828f20a4\") " Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.377291 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f88c5fd6-6888-43d2-8772-2055f386b9e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f88c5fd6-6888-43d2-8772-2055f386b9e7" (UID: "f88c5fd6-6888-43d2-8772-2055f386b9e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.377498 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94225dcd-0095-452e-a108-5e98828f20a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94225dcd-0095-452e-a108-5e98828f20a4" (UID: "94225dcd-0095-452e-a108-5e98828f20a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.379675 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88c5fd6-6888-43d2-8772-2055f386b9e7-kube-api-access-vcnbm" (OuterVolumeSpecName: "kube-api-access-vcnbm") pod "f88c5fd6-6888-43d2-8772-2055f386b9e7" (UID: "f88c5fd6-6888-43d2-8772-2055f386b9e7"). InnerVolumeSpecName "kube-api-access-vcnbm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.380167 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94225dcd-0095-452e-a108-5e98828f20a4-kube-api-access-bp7sq" (OuterVolumeSpecName: "kube-api-access-bp7sq") pod "94225dcd-0095-452e-a108-5e98828f20a4" (UID: "94225dcd-0095-452e-a108-5e98828f20a4"). InnerVolumeSpecName "kube-api-access-bp7sq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.479134 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bp7sq\" (UniqueName: \"kubernetes.io/projected/94225dcd-0095-452e-a108-5e98828f20a4-kube-api-access-bp7sq\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.479171 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94225dcd-0095-452e-a108-5e98828f20a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.479180 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vcnbm\" (UniqueName: \"kubernetes.io/projected/f88c5fd6-6888-43d2-8772-2055f386b9e7-kube-api-access-vcnbm\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.479190 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f88c5fd6-6888-43d2-8772-2055f386b9e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.788508 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0192-account-create-update-5clsm" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.788528 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0192-account-create-update-5clsm" event={"ID":"f88c5fd6-6888-43d2-8772-2055f386b9e7","Type":"ContainerDied","Data":"01c81f23db63b51133d06f2f43f6c206e7b4808b77a6d396abe5640308e02982"} Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.789059 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01c81f23db63b51133d06f2f43f6c206e7b4808b77a6d396abe5640308e02982" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.790606 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-69e3-account-create-update-f7fsr" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.790626 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-69e3-account-create-update-f7fsr" event={"ID":"e27d0974-edc9-4344-85a9-52cb419cafaa","Type":"ContainerDied","Data":"5dd1c41c72a229dd08249de99b60bcf7865d533c17289d64497071f29bd6a0ba"} Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.790826 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd1c41c72a229dd08249de99b60bcf7865d533c17289d64497071f29bd6a0ba" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.792712 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" event={"ID":"94225dcd-0095-452e-a108-5e98828f20a4","Type":"ContainerDied","Data":"924dfb975dad254a00923014e51a92869479d51ce098a720ce7b4cdb8821a6cd"} Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.792743 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924dfb975dad254a00923014e51a92869479d51ce098a720ce7b4cdb8821a6cd" Nov 26 15:03:28 crc kubenswrapper[5200]: I1126 15:03:28.792746 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0e9e-account-create-update-plwlj" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.154375 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.154715 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.154763 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.155383 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c57089b0cc90614ea2b7d202afb3706d3131737053c127ff99d3fcebdf0010ac"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.155431 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://c57089b0cc90614ea2b7d202afb3706d3131737053c127ff99d3fcebdf0010ac" gracePeriod=600 Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.380545 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bhmn"] Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382077 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c3d2845-4af4-4f14-ade5-afc0530f2ed1" containerName="mariadb-database-create" Nov 
26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382097 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3d2845-4af4-4f14-ade5-afc0530f2ed1" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382109 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94225dcd-0095-452e-a108-5e98828f20a4" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382116 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="94225dcd-0095-452e-a108-5e98828f20a4" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382131 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27d0974-edc9-4344-85a9-52cb419cafaa" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382137 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27d0974-edc9-4344-85a9-52cb419cafaa" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382154 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f88c5fd6-6888-43d2-8772-2055f386b9e7" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382160 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88c5fd6-6888-43d2-8772-2055f386b9e7" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382172 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0595c8ee-0fcc-46d9-a97a-fff2b66a5284" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382177 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0595c8ee-0fcc-46d9-a97a-fff2b66a5284" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382209 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="710a41bd-779a-4224-82d6-03ba64697e32" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382215 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="710a41bd-779a-4224-82d6-03ba64697e32" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382375 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e27d0974-edc9-4344-85a9-52cb419cafaa" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382393 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0595c8ee-0fcc-46d9-a97a-fff2b66a5284" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382406 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f88c5fd6-6888-43d2-8772-2055f386b9e7" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382414 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c3d2845-4af4-4f14-ade5-afc0530f2ed1" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382423 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="94225dcd-0095-452e-a108-5e98828f20a4" containerName="mariadb-account-create-update" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.382432 5200 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="710a41bd-779a-4224-82d6-03ba64697e32" containerName="mariadb-database-create" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.451363 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bhmn"] Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.451498 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.453772 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.455322 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-scripts\"" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.455515 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-nova-dockercfg-klwr9\"" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.583247 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.583294 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-config-data\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.583597 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-scripts\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.583778 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjgpt\" (UniqueName: \"kubernetes.io/projected/bd766e85-0f40-4106-ba90-a82abaf7e00b-kube-api-access-bjgpt\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.685915 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.686243 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-config-data\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 
15:03:33.686300 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-scripts\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.686336 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjgpt\" (UniqueName: \"kubernetes.io/projected/bd766e85-0f40-4106-ba90-a82abaf7e00b-kube-api-access-bjgpt\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.691976 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-scripts\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.692130 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-config-data\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.696722 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.708287 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjgpt\" (UniqueName: \"kubernetes.io/projected/bd766e85-0f40-4106-ba90-a82abaf7e00b-kube-api-access-bjgpt\") pod \"nova-cell0-conductor-db-sync-6bhmn\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.801158 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.842334 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="c57089b0cc90614ea2b7d202afb3706d3131737053c127ff99d3fcebdf0010ac" exitCode=0 Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.842529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"c57089b0cc90614ea2b7d202afb3706d3131737053c127ff99d3fcebdf0010ac"} Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.842594 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd"} Nov 26 15:03:33 crc kubenswrapper[5200]: I1126 15:03:33.842616 5200 scope.go:117] "RemoveContainer" containerID="fb8c16c9845d9e4e2c253cdbf683634a21eec093f10d17a1722235989a6388cc" Nov 26 15:03:34 crc kubenswrapper[5200]: I1126 15:03:34.251373 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bhmn"] Nov 26 15:03:34 crc kubenswrapper[5200]: W1126 15:03:34.261533 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd766e85_0f40_4106_ba90_a82abaf7e00b.slice/crio-001d2f7f77a5d5db4fa5b1d31cdb751c6a1638d52a92bb0c599a2be595690c8c WatchSource:0}: Error finding container 001d2f7f77a5d5db4fa5b1d31cdb751c6a1638d52a92bb0c599a2be595690c8c: Status 404 returned error can't find the container with id 001d2f7f77a5d5db4fa5b1d31cdb751c6a1638d52a92bb0c599a2be595690c8c Nov 26 15:03:34 crc kubenswrapper[5200]: I1126 15:03:34.851445 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" event={"ID":"bd766e85-0f40-4106-ba90-a82abaf7e00b","Type":"ContainerStarted","Data":"6042d147398272a3cf66bf47c8885391a45df54976aec23eeeb0c6df8ca23b84"} Nov 26 15:03:34 crc kubenswrapper[5200]: I1126 15:03:34.851884 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" event={"ID":"bd766e85-0f40-4106-ba90-a82abaf7e00b","Type":"ContainerStarted","Data":"001d2f7f77a5d5db4fa5b1d31cdb751c6a1638d52a92bb0c599a2be595690c8c"} Nov 26 15:03:41 crc kubenswrapper[5200]: I1126 15:03:41.936931 5200 generic.go:358] "Generic (PLEG): container finished" podID="bd766e85-0f40-4106-ba90-a82abaf7e00b" containerID="6042d147398272a3cf66bf47c8885391a45df54976aec23eeeb0c6df8ca23b84" exitCode=0 Nov 26 15:03:41 crc kubenswrapper[5200]: I1126 15:03:41.937064 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" event={"ID":"bd766e85-0f40-4106-ba90-a82abaf7e00b","Type":"ContainerDied","Data":"6042d147398272a3cf66bf47c8885391a45df54976aec23eeeb0c6df8ca23b84"} Nov 26 15:03:42 crc kubenswrapper[5200]: E1126 15:03:42.331194 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1608941 actualBytes=10240 Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.306219 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.390253 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-config-data\") pod \"bd766e85-0f40-4106-ba90-a82abaf7e00b\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.390373 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-combined-ca-bundle\") pod \"bd766e85-0f40-4106-ba90-a82abaf7e00b\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.390799 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjgpt\" (UniqueName: \"kubernetes.io/projected/bd766e85-0f40-4106-ba90-a82abaf7e00b-kube-api-access-bjgpt\") pod \"bd766e85-0f40-4106-ba90-a82abaf7e00b\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.390839 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-scripts\") pod \"bd766e85-0f40-4106-ba90-a82abaf7e00b\" (UID: \"bd766e85-0f40-4106-ba90-a82abaf7e00b\") " Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.397126 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-scripts" (OuterVolumeSpecName: "scripts") pod "bd766e85-0f40-4106-ba90-a82abaf7e00b" (UID: "bd766e85-0f40-4106-ba90-a82abaf7e00b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.397812 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd766e85-0f40-4106-ba90-a82abaf7e00b-kube-api-access-bjgpt" (OuterVolumeSpecName: "kube-api-access-bjgpt") pod "bd766e85-0f40-4106-ba90-a82abaf7e00b" (UID: "bd766e85-0f40-4106-ba90-a82abaf7e00b"). InnerVolumeSpecName "kube-api-access-bjgpt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.417371 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd766e85-0f40-4106-ba90-a82abaf7e00b" (UID: "bd766e85-0f40-4106-ba90-a82abaf7e00b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.421798 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-config-data" (OuterVolumeSpecName: "config-data") pod "bd766e85-0f40-4106-ba90-a82abaf7e00b" (UID: "bd766e85-0f40-4106-ba90-a82abaf7e00b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.493808 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bjgpt\" (UniqueName: \"kubernetes.io/projected/bd766e85-0f40-4106-ba90-a82abaf7e00b-kube-api-access-bjgpt\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.493855 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.493895 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.493910 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd766e85-0f40-4106-ba90-a82abaf7e00b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.961520 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.961546 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bhmn" event={"ID":"bd766e85-0f40-4106-ba90-a82abaf7e00b","Type":"ContainerDied","Data":"001d2f7f77a5d5db4fa5b1d31cdb751c6a1638d52a92bb0c599a2be595690c8c"} Nov 26 15:03:43 crc kubenswrapper[5200]: I1126 15:03:43.961599 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001d2f7f77a5d5db4fa5b1d31cdb751c6a1638d52a92bb0c599a2be595690c8c" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.076192 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.077266 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd766e85-0f40-4106-ba90-a82abaf7e00b" containerName="nova-cell0-conductor-db-sync" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.077287 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd766e85-0f40-4106-ba90-a82abaf7e00b" containerName="nova-cell0-conductor-db-sync" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.077485 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd766e85-0f40-4106-ba90-a82abaf7e00b" containerName="nova-cell0-conductor-db-sync" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.083850 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.086322 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.087728 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.087912 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-nova-dockercfg-klwr9\"" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.207431 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7l8\" (UniqueName: \"kubernetes.io/projected/96f31a1f-212f-4e9d-bda1-91f2bde393ba-kube-api-access-zc7l8\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.207585 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.207632 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.309256 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.309316 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.309405 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7l8\" (UniqueName: \"kubernetes.io/projected/96f31a1f-212f-4e9d-bda1-91f2bde393ba-kube-api-access-zc7l8\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.315895 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.316204 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-config-data\") 
pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.326614 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc7l8\" (UniqueName: \"kubernetes.io/projected/96f31a1f-212f-4e9d-bda1-91f2bde393ba-kube-api-access-zc7l8\") pod \"nova-cell0-conductor-0\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.406397 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.839666 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:03:44 crc kubenswrapper[5200]: W1126 15:03:44.848336 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f31a1f_212f_4e9d_bda1_91f2bde393ba.slice/crio-42e32d040d711d1333f5057d9aeea7c5437b913ea1d236e881fae1d6f4193b8a WatchSource:0}: Error finding container 42e32d040d711d1333f5057d9aeea7c5437b913ea1d236e881fae1d6f4193b8a: Status 404 returned error can't find the container with id 42e32d040d711d1333f5057d9aeea7c5437b913ea1d236e881fae1d6f4193b8a Nov 26 15:03:44 crc kubenswrapper[5200]: I1126 15:03:44.979070 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96f31a1f-212f-4e9d-bda1-91f2bde393ba","Type":"ContainerStarted","Data":"42e32d040d711d1333f5057d9aeea7c5437b913ea1d236e881fae1d6f4193b8a"} Nov 26 15:03:45 crc kubenswrapper[5200]: I1126 15:03:45.986925 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96f31a1f-212f-4e9d-bda1-91f2bde393ba","Type":"ContainerStarted","Data":"9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb"} Nov 26 15:03:45 crc kubenswrapper[5200]: I1126 15:03:45.987473 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:46 crc kubenswrapper[5200]: I1126 15:03:46.015003 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.014982085 podStartE2EDuration="2.014982085s" podCreationTimestamp="2025-11-26 15:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:46.003244665 +0000 UTC m=+5594.682446493" watchObservedRunningTime="2025-11-26 15:03:46.014982085 +0000 UTC m=+5594.694183903" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.054575 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.646910 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4p82p"] Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.653182 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.656298 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-manage-scripts\"" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.661013 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-manage-config-data\"" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.683265 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4p82p"] Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.785806 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.785876 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-scripts\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.785936 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-config-data\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.785984 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlcb\" (UniqueName: \"kubernetes.io/projected/c36d893d-1655-412a-ae4a-9ed40c6f68b2-kube-api-access-6zlcb\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.828553 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.841205 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.843788 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.860304 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.878827 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.888083 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlcb\" (UniqueName: \"kubernetes.io/projected/c36d893d-1655-412a-ae4a-9ed40c6f68b2-kube-api-access-6zlcb\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.888216 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.888256 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-scripts\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.888311 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-config-data\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.893817 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.897648 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.913108 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.917413 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-config-data\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.923743 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-scripts\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.924090 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlcb\" (UniqueName: \"kubernetes.io/projected/c36d893d-1655-412a-ae4a-9ed40c6f68b2-kube-api-access-6zlcb\") pod \"nova-cell0-cell-mapping-4p82p\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.938574 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991106 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-config-data\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991161 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrrf\" (UniqueName: \"kubernetes.io/projected/9d24e697-f760-49e9-b4bc-362786bb83bc-kube-api-access-wxrrf\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991207 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcc5l\" (UniqueName: \"kubernetes.io/projected/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-kube-api-access-wcc5l\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991236 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-config-data\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991255 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991271 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991329 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-logs\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.991359 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d24e697-f760-49e9-b4bc-362786bb83bc-logs\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:52 crc kubenswrapper[5200]: I1126 15:03:52.993246 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.014792 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.030899 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.034432 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.035802 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.056112 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.077395 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-novncproxy-config-data\"" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.097708 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.102219 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-config-data\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.102385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrrf\" (UniqueName: \"kubernetes.io/projected/9d24e697-f760-49e9-b4bc-362786bb83bc-kube-api-access-wxrrf\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.102913 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcc5l\" (UniqueName: \"kubernetes.io/projected/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-kube-api-access-wcc5l\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.103017 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-config-data\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.103044 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.103251 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.103458 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-logs\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.113969 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-logs\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.120831 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d24e697-f760-49e9-b4bc-362786bb83bc-logs\") pod \"nova-api-0\" (UID: 
\"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.146260 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-config-data\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.147201 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.150596 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d24e697-f760-49e9-b4bc-362786bb83bc-logs\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.175222 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-config-data\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.175398 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.178341 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.242259 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcc5l\" (UniqueName: \"kubernetes.io/projected/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-kube-api-access-wcc5l\") pod \"nova-metadata-0\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.249389 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.249458 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.249519 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzsn5\" (UniqueName: \"kubernetes.io/projected/a757b0bc-87fc-439b-bbd5-3e0aea855eff-kube-api-access-qzsn5\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 
15:03:53.249566 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.249593 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4n5f\" (UniqueName: \"kubernetes.io/projected/ff3b780a-3460-455b-9bdc-070f1c747d7a-kube-api-access-z4n5f\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.249652 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-config-data\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.250419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrrf\" (UniqueName: \"kubernetes.io/projected/9d24e697-f760-49e9-b4bc-362786bb83bc-kube-api-access-wxrrf\") pod \"nova-api-0\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.268408 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b67fff77-czhrj"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.286703 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b67fff77-czhrj"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.286910 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.317891 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.352877 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.352976 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.353075 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzsn5\" (UniqueName: \"kubernetes.io/projected/a757b0bc-87fc-439b-bbd5-3e0aea855eff-kube-api-access-qzsn5\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.353128 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.353183 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4n5f\" (UniqueName: \"kubernetes.io/projected/ff3b780a-3460-455b-9bdc-070f1c747d7a-kube-api-access-z4n5f\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.353269 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-config-data\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.358543 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-config-data\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.359396 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.360498 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.382816 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.383366 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4n5f\" (UniqueName: \"kubernetes.io/projected/ff3b780a-3460-455b-9bdc-070f1c747d7a-kube-api-access-z4n5f\") pod \"nova-cell1-novncproxy-0\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.383917 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzsn5\" (UniqueName: \"kubernetes.io/projected/a757b0bc-87fc-439b-bbd5-3e0aea855eff-kube-api-access-qzsn5\") pod \"nova-scheduler-0\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.459300 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-nb\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.459588 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-sb\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.459754 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkzv\" (UniqueName: \"kubernetes.io/projected/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-kube-api-access-sbkzv\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.459846 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-dns-svc\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.459947 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-config\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.475326 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.478884 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.548128 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.561731 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-nb\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.561802 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-sb\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.561842 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkzv\" (UniqueName: \"kubernetes.io/projected/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-kube-api-access-sbkzv\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.561869 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-dns-svc\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.561934 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-config\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.563258 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-config\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.563633 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-nb\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.563764 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-dns-svc\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.563894 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-sb\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.587344 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkzv\" (UniqueName: \"kubernetes.io/projected/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-kube-api-access-sbkzv\") pod \"dnsmasq-dns-6b67fff77-czhrj\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.594254 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4p82p"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.614131 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.760616 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k55nm"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.774737 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.782049 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.789881 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-scripts\"" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.801733 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k55nm"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.878039 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-config-data\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.882110 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkpcl\" (UniqueName: \"kubernetes.io/projected/3b6498da-7f02-4d36-89d3-7ec2246a67aa-kube-api-access-nkpcl\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.882347 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-scripts\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.882450 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.928990 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.984123 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-scripts\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.984163 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.984232 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-config-data\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.984367 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkpcl\" (UniqueName: \"kubernetes.io/projected/3b6498da-7f02-4d36-89d3-7ec2246a67aa-kube-api-access-nkpcl\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:53 crc kubenswrapper[5200]: I1126 15:03:53.993907 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-config-data\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.009560 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.019190 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkpcl\" (UniqueName: \"kubernetes.io/projected/3b6498da-7f02-4d36-89d3-7ec2246a67aa-kube-api-access-nkpcl\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.024003 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-scripts\") pod \"nova-cell1-conductor-db-sync-k55nm\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.149239 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:03:54 crc kubenswrapper[5200]: W1126 15:03:54.165093 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda757b0bc_87fc_439b_bbd5_3e0aea855eff.slice/crio-fadcb2a0140c57aa37aa617cbec933bcbf35d7c0fb90b25bcfeb6f5b83723e3b WatchSource:0}: Error finding container 
fadcb2a0140c57aa37aa617cbec933bcbf35d7c0fb90b25bcfeb6f5b83723e3b: Status 404 returned error can't find the container with id fadcb2a0140c57aa37aa617cbec933bcbf35d7c0fb90b25bcfeb6f5b83723e3b Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.190039 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.194551 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:03:54 crc kubenswrapper[5200]: W1126 15:03:54.215202 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d24e697_f760_49e9_b4bc_362786bb83bc.slice/crio-ec86fc1ab0d85e474da177aeb83b2fc6f6c0a302ef4ec0b1667222bf7b13cd73 WatchSource:0}: Error finding container ec86fc1ab0d85e474da177aeb83b2fc6f6c0a302ef4ec0b1667222bf7b13cd73: Status 404 returned error can't find the container with id ec86fc1ab0d85e474da177aeb83b2fc6f6c0a302ef4ec0b1667222bf7b13cd73 Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.300784 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d24e697-f760-49e9-b4bc-362786bb83bc","Type":"ContainerStarted","Data":"ec86fc1ab0d85e474da177aeb83b2fc6f6c0a302ef4ec0b1667222bf7b13cd73"} Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.325340 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"684fd507-f2e3-4ab6-9313-89a84aa2a8fe","Type":"ContainerStarted","Data":"30e5e1d8f8113d116499068d4773df379a36cea93f5ab2dfa47fb58b3206780c"} Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.331157 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4p82p" event={"ID":"c36d893d-1655-412a-ae4a-9ed40c6f68b2","Type":"ContainerStarted","Data":"36d7c7de230f9d58d22c081caa3e1f13eb945794c41d84c9dc8e664a97ef10c9"} Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.335037 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a757b0bc-87fc-439b-bbd5-3e0aea855eff","Type":"ContainerStarted","Data":"fadcb2a0140c57aa37aa617cbec933bcbf35d7c0fb90b25bcfeb6f5b83723e3b"} Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.357331 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4p82p" podStartSLOduration=2.3573031110000002 podStartE2EDuration="2.357303111s" podCreationTimestamp="2025-11-26 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:54.349295929 +0000 UTC m=+5603.028497757" watchObservedRunningTime="2025-11-26 15:03:54.357303111 +0000 UTC m=+5603.036504929" Nov 26 15:03:54 crc kubenswrapper[5200]: W1126 15:03:54.413592 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff3b780a_3460_455b_9bdc_070f1c747d7a.slice/crio-82b3ff54e4e60a52bca6b62f69d54dbd29c1152200d6da48ab3d21859795df21 WatchSource:0}: Error finding container 82b3ff54e4e60a52bca6b62f69d54dbd29c1152200d6da48ab3d21859795df21: Status 404 returned error can't find the container with id 82b3ff54e4e60a52bca6b62f69d54dbd29c1152200d6da48ab3d21859795df21 Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.425628 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.433987 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b67fff77-czhrj"] Nov 26 15:03:54 crc kubenswrapper[5200]: W1126 15:03:54.471808 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493f1a59_ca3d_4ec9_9626_6b6eeb02dbcf.slice/crio-d8faef96794eecc3ca9ec42d4c92609f27b6c369243ab0d9dbf9060d7854e61f WatchSource:0}: Error finding container d8faef96794eecc3ca9ec42d4c92609f27b6c369243ab0d9dbf9060d7854e61f: Status 404 returned error can't find the container with id d8faef96794eecc3ca9ec42d4c92609f27b6c369243ab0d9dbf9060d7854e61f Nov 26 15:03:54 crc kubenswrapper[5200]: I1126 15:03:54.777730 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k55nm"] Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.352379 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ff3b780a-3460-455b-9bdc-070f1c747d7a","Type":"ContainerStarted","Data":"ea853a65412e0624d2988d49e4cdecbe87ad0112d998218517e34f11724c848b"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.353384 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ff3b780a-3460-455b-9bdc-070f1c747d7a","Type":"ContainerStarted","Data":"82b3ff54e4e60a52bca6b62f69d54dbd29c1152200d6da48ab3d21859795df21"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.360967 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"684fd507-f2e3-4ab6-9313-89a84aa2a8fe","Type":"ContainerStarted","Data":"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.361282 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"684fd507-f2e3-4ab6-9313-89a84aa2a8fe","Type":"ContainerStarted","Data":"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.366417 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4p82p" event={"ID":"c36d893d-1655-412a-ae4a-9ed40c6f68b2","Type":"ContainerStarted","Data":"529a846ffe4e6f6be4eb4bedce481a90e939c3666723c3cb5b69d125f1c56c15"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.381036 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a757b0bc-87fc-439b-bbd5-3e0aea855eff","Type":"ContainerStarted","Data":"d0eafa9dde6fa57957feab076a6f2acfe739b429e4158b21749fd8a32a3f64e3"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.385414 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d24e697-f760-49e9-b4bc-362786bb83bc","Type":"ContainerStarted","Data":"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.385634 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d24e697-f760-49e9-b4bc-362786bb83bc","Type":"ContainerStarted","Data":"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.387634 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k55nm" 
event={"ID":"3b6498da-7f02-4d36-89d3-7ec2246a67aa","Type":"ContainerStarted","Data":"e8422a1ad238a2d8574b310c987bc9355d98c86a1506f9ffc5a57f15ddd04d3a"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.387793 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k55nm" event={"ID":"3b6498da-7f02-4d36-89d3-7ec2246a67aa","Type":"ContainerStarted","Data":"55dfe1488fbf01fdcff42df5ede6e128e15b53cc605c66063adb88a2a88b7641"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.390198 5200 generic.go:358] "Generic (PLEG): container finished" podID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerID="aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2" exitCode=0 Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.390384 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" event={"ID":"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf","Type":"ContainerDied","Data":"aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.390460 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" event={"ID":"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf","Type":"ContainerStarted","Data":"d8faef96794eecc3ca9ec42d4c92609f27b6c369243ab0d9dbf9060d7854e61f"} Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.392995 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.392961999 podStartE2EDuration="3.392961999s" podCreationTimestamp="2025-11-26 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:55.377806009 +0000 UTC m=+5604.057007827" watchObservedRunningTime="2025-11-26 15:03:55.392961999 +0000 UTC m=+5604.072163827" Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.413271 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.413245225 podStartE2EDuration="3.413245225s" podCreationTimestamp="2025-11-26 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:55.400466918 +0000 UTC m=+5604.079668736" watchObservedRunningTime="2025-11-26 15:03:55.413245225 +0000 UTC m=+5604.092447043" Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.512922 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.512896965 podStartE2EDuration="3.512896965s" podCreationTimestamp="2025-11-26 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:55.444095259 +0000 UTC m=+5604.123297087" watchObservedRunningTime="2025-11-26 15:03:55.512896965 +0000 UTC m=+5604.192098783" Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.533444 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.5334195470000003 podStartE2EDuration="3.533419547s" podCreationTimestamp="2025-11-26 15:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:55.463160263 +0000 UTC m=+5604.142362071" watchObservedRunningTime="2025-11-26 
15:03:55.533419547 +0000 UTC m=+5604.212621385" Nov 26 15:03:55 crc kubenswrapper[5200]: I1126 15:03:55.571127 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-k55nm" podStartSLOduration=2.571099082 podStartE2EDuration="2.571099082s" podCreationTimestamp="2025-11-26 15:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:55.489659152 +0000 UTC m=+5604.168860970" watchObservedRunningTime="2025-11-26 15:03:55.571099082 +0000 UTC m=+5604.250300900" Nov 26 15:03:56 crc kubenswrapper[5200]: I1126 15:03:56.410395 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" event={"ID":"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf","Type":"ContainerStarted","Data":"f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3"} Nov 26 15:03:56 crc kubenswrapper[5200]: I1126 15:03:56.411867 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:03:56 crc kubenswrapper[5200]: I1126 15:03:56.453798 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" podStartSLOduration=3.453776601 podStartE2EDuration="3.453776601s" podCreationTimestamp="2025-11-26 15:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:56.445471152 +0000 UTC m=+5605.124672980" watchObservedRunningTime="2025-11-26 15:03:56.453776601 +0000 UTC m=+5605.132978419" Nov 26 15:03:58 crc kubenswrapper[5200]: I1126 15:03:58.319724 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:03:58 crc kubenswrapper[5200]: I1126 15:03:58.321148 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:03:58 crc kubenswrapper[5200]: I1126 15:03:58.447219 5200 generic.go:358] "Generic (PLEG): container finished" podID="3b6498da-7f02-4d36-89d3-7ec2246a67aa" containerID="e8422a1ad238a2d8574b310c987bc9355d98c86a1506f9ffc5a57f15ddd04d3a" exitCode=0 Nov 26 15:03:58 crc kubenswrapper[5200]: I1126 15:03:58.447288 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k55nm" event={"ID":"3b6498da-7f02-4d36-89d3-7ec2246a67aa","Type":"ContainerDied","Data":"e8422a1ad238a2d8574b310c987bc9355d98c86a1506f9ffc5a57f15ddd04d3a"} Nov 26 15:03:58 crc kubenswrapper[5200]: I1126 15:03:58.476847 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 15:03:58 crc kubenswrapper[5200]: I1126 15:03:58.549797 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.460487 5200 generic.go:358] "Generic (PLEG): container finished" podID="c36d893d-1655-412a-ae4a-9ed40c6f68b2" containerID="529a846ffe4e6f6be4eb4bedce481a90e939c3666723c3cb5b69d125f1c56c15" exitCode=0 Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.460587 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4p82p" event={"ID":"c36d893d-1655-412a-ae4a-9ed40c6f68b2","Type":"ContainerDied","Data":"529a846ffe4e6f6be4eb4bedce481a90e939c3666723c3cb5b69d125f1c56c15"} Nov 26 15:03:59 
crc kubenswrapper[5200]: I1126 15:03:59.847579 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.912880 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-scripts\") pod \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.913060 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkpcl\" (UniqueName: \"kubernetes.io/projected/3b6498da-7f02-4d36-89d3-7ec2246a67aa-kube-api-access-nkpcl\") pod \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.913222 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-combined-ca-bundle\") pod \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.918628 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-scripts" (OuterVolumeSpecName: "scripts") pod "3b6498da-7f02-4d36-89d3-7ec2246a67aa" (UID: "3b6498da-7f02-4d36-89d3-7ec2246a67aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.922867 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6498da-7f02-4d36-89d3-7ec2246a67aa-kube-api-access-nkpcl" (OuterVolumeSpecName: "kube-api-access-nkpcl") pod "3b6498da-7f02-4d36-89d3-7ec2246a67aa" (UID: "3b6498da-7f02-4d36-89d3-7ec2246a67aa"). InnerVolumeSpecName "kube-api-access-nkpcl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:03:59 crc kubenswrapper[5200]: I1126 15:03:59.961617 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b6498da-7f02-4d36-89d3-7ec2246a67aa" (UID: "3b6498da-7f02-4d36-89d3-7ec2246a67aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.015213 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-config-data\") pod \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\" (UID: \"3b6498da-7f02-4d36-89d3-7ec2246a67aa\") " Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.015967 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.015987 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.015997 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkpcl\" (UniqueName: \"kubernetes.io/projected/3b6498da-7f02-4d36-89d3-7ec2246a67aa-kube-api-access-nkpcl\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.043970 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-config-data" (OuterVolumeSpecName: "config-data") pod "3b6498da-7f02-4d36-89d3-7ec2246a67aa" (UID: "3b6498da-7f02-4d36-89d3-7ec2246a67aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.118110 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b6498da-7f02-4d36-89d3-7ec2246a67aa-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.473480 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k55nm" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.473538 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k55nm" event={"ID":"3b6498da-7f02-4d36-89d3-7ec2246a67aa","Type":"ContainerDied","Data":"55dfe1488fbf01fdcff42df5ede6e128e15b53cc605c66063adb88a2a88b7641"} Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.473600 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55dfe1488fbf01fdcff42df5ede6e128e15b53cc605c66063adb88a2a88b7641" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.566013 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.569067 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b6498da-7f02-4d36-89d3-7ec2246a67aa" containerName="nova-cell1-conductor-db-sync" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.569114 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6498da-7f02-4d36-89d3-7ec2246a67aa" containerName="nova-cell1-conductor-db-sync" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.569537 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b6498da-7f02-4d36-89d3-7ec2246a67aa" containerName="nova-cell1-conductor-db-sync" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.581037 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.581189 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.586318 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.631793 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.632095 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.632332 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmvd\" (UniqueName: \"kubernetes.io/projected/2c678887-cc1d-4694-9cd6-8fbacccacd07-kube-api-access-vmmvd\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.734071 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc 
kubenswrapper[5200]: I1126 15:04:00.735320 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmvd\" (UniqueName: \"kubernetes.io/projected/2c678887-cc1d-4694-9cd6-8fbacccacd07-kube-api-access-vmmvd\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.735416 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.739622 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.740068 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.756343 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmvd\" (UniqueName: \"kubernetes.io/projected/2c678887-cc1d-4694-9cd6-8fbacccacd07-kube-api-access-vmmvd\") pod \"nova-cell1-conductor-0\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.836231 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.916932 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.938344 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-combined-ca-bundle\") pod \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.938473 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-config-data\") pod \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.938812 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-scripts\") pod \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.938864 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlcb\" (UniqueName: \"kubernetes.io/projected/c36d893d-1655-412a-ae4a-9ed40c6f68b2-kube-api-access-6zlcb\") pod \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\" (UID: \"c36d893d-1655-412a-ae4a-9ed40c6f68b2\") " Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.942791 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36d893d-1655-412a-ae4a-9ed40c6f68b2-kube-api-access-6zlcb" (OuterVolumeSpecName: "kube-api-access-6zlcb") pod "c36d893d-1655-412a-ae4a-9ed40c6f68b2" (UID: "c36d893d-1655-412a-ae4a-9ed40c6f68b2"). InnerVolumeSpecName "kube-api-access-6zlcb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.944173 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-scripts" (OuterVolumeSpecName: "scripts") pod "c36d893d-1655-412a-ae4a-9ed40c6f68b2" (UID: "c36d893d-1655-412a-ae4a-9ed40c6f68b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.965541 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-config-data" (OuterVolumeSpecName: "config-data") pod "c36d893d-1655-412a-ae4a-9ed40c6f68b2" (UID: "c36d893d-1655-412a-ae4a-9ed40c6f68b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:00 crc kubenswrapper[5200]: I1126 15:04:00.977997 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c36d893d-1655-412a-ae4a-9ed40c6f68b2" (UID: "c36d893d-1655-412a-ae4a-9ed40c6f68b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.042090 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.042538 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.042550 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36d893d-1655-412a-ae4a-9ed40c6f68b2-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.042561 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zlcb\" (UniqueName: \"kubernetes.io/projected/c36d893d-1655-412a-ae4a-9ed40c6f68b2-kube-api-access-6zlcb\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.356020 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.502076 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4p82p" event={"ID":"c36d893d-1655-412a-ae4a-9ed40c6f68b2","Type":"ContainerDied","Data":"36d7c7de230f9d58d22c081caa3e1f13eb945794c41d84c9dc8e664a97ef10c9"} Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.502130 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36d7c7de230f9d58d22c081caa3e1f13eb945794c41d84c9dc8e664a97ef10c9" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.502090 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4p82p" Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.505022 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c678887-cc1d-4694-9cd6-8fbacccacd07","Type":"ContainerStarted","Data":"976f85cb87751213d380aede4ddc6583eedef61b2b11ea0b079506b1c75fe3e1"} Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.687224 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.688012 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-log" containerID="cri-o://08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db" gracePeriod=30 Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.688134 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-api" containerID="cri-o://63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1" gracePeriod=30 Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.732116 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.732469 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a757b0bc-87fc-439b-bbd5-3e0aea855eff" containerName="nova-scheduler-scheduler" containerID="cri-o://d0eafa9dde6fa57957feab076a6f2acfe739b429e4158b21749fd8a32a3f64e3" gracePeriod=30 Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.749014 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.749338 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-log" containerID="cri-o://a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43" gracePeriod=30 Nov 26 15:04:01 crc kubenswrapper[5200]: I1126 15:04:01.749541 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-metadata" containerID="cri-o://1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f" gracePeriod=30 Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.278708 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.373772 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-config-data\") pod \"9d24e697-f760-49e9-b4bc-362786bb83bc\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.373892 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d24e697-f760-49e9-b4bc-362786bb83bc-logs\") pod \"9d24e697-f760-49e9-b4bc-362786bb83bc\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.373915 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxrrf\" (UniqueName: \"kubernetes.io/projected/9d24e697-f760-49e9-b4bc-362786bb83bc-kube-api-access-wxrrf\") pod \"9d24e697-f760-49e9-b4bc-362786bb83bc\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.374017 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-combined-ca-bundle\") pod \"9d24e697-f760-49e9-b4bc-362786bb83bc\" (UID: \"9d24e697-f760-49e9-b4bc-362786bb83bc\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.374229 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d24e697-f760-49e9-b4bc-362786bb83bc-logs" (OuterVolumeSpecName: "logs") pod "9d24e697-f760-49e9-b4bc-362786bb83bc" (UID: "9d24e697-f760-49e9-b4bc-362786bb83bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.374509 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d24e697-f760-49e9-b4bc-362786bb83bc-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.380091 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.381004 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d24e697-f760-49e9-b4bc-362786bb83bc-kube-api-access-wxrrf" (OuterVolumeSpecName: "kube-api-access-wxrrf") pod "9d24e697-f760-49e9-b4bc-362786bb83bc" (UID: "9d24e697-f760-49e9-b4bc-362786bb83bc"). InnerVolumeSpecName "kube-api-access-wxrrf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.413769 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d24e697-f760-49e9-b4bc-362786bb83bc" (UID: "9d24e697-f760-49e9-b4bc-362786bb83bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.422788 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-config-data" (OuterVolumeSpecName: "config-data") pod "9d24e697-f760-49e9-b4bc-362786bb83bc" (UID: "9d24e697-f760-49e9-b4bc-362786bb83bc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.475985 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-combined-ca-bundle\") pod \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.476049 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcc5l\" (UniqueName: \"kubernetes.io/projected/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-kube-api-access-wcc5l\") pod \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.476145 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-config-data\") pod \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.476294 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-logs\") pod \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\" (UID: \"684fd507-f2e3-4ab6-9313-89a84aa2a8fe\") " Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.477097 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.477117 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxrrf\" (UniqueName: \"kubernetes.io/projected/9d24e697-f760-49e9-b4bc-362786bb83bc-kube-api-access-wxrrf\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.477128 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d24e697-f760-49e9-b4bc-362786bb83bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.477362 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-logs" (OuterVolumeSpecName: "logs") pod "684fd507-f2e3-4ab6-9313-89a84aa2a8fe" (UID: "684fd507-f2e3-4ab6-9313-89a84aa2a8fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.481046 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-kube-api-access-wcc5l" (OuterVolumeSpecName: "kube-api-access-wcc5l") pod "684fd507-f2e3-4ab6-9313-89a84aa2a8fe" (UID: "684fd507-f2e3-4ab6-9313-89a84aa2a8fe"). InnerVolumeSpecName "kube-api-access-wcc5l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.519151 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "684fd507-f2e3-4ab6-9313-89a84aa2a8fe" (UID: "684fd507-f2e3-4ab6-9313-89a84aa2a8fe"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525139 5200 generic.go:358] "Generic (PLEG): container finished" podID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerID="1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f" exitCode=0 Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525170 5200 generic.go:358] "Generic (PLEG): container finished" podID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerID="a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43" exitCode=143 Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525210 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525244 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"684fd507-f2e3-4ab6-9313-89a84aa2a8fe","Type":"ContainerDied","Data":"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525310 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"684fd507-f2e3-4ab6-9313-89a84aa2a8fe","Type":"ContainerDied","Data":"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525326 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"684fd507-f2e3-4ab6-9313-89a84aa2a8fe","Type":"ContainerDied","Data":"30e5e1d8f8113d116499068d4773df379a36cea93f5ab2dfa47fb58b3206780c"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525349 5200 scope.go:117] "RemoveContainer" containerID="1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.525942 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-config-data" (OuterVolumeSpecName: "config-data") pod "684fd507-f2e3-4ab6-9313-89a84aa2a8fe" (UID: "684fd507-f2e3-4ab6-9313-89a84aa2a8fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.528288 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c678887-cc1d-4694-9cd6-8fbacccacd07","Type":"ContainerStarted","Data":"c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.529009 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.531303 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerID="63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1" exitCode=0 Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.531329 5200 generic.go:358] "Generic (PLEG): container finished" podID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerID="08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db" exitCode=143 Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.531503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d24e697-f760-49e9-b4bc-362786bb83bc","Type":"ContainerDied","Data":"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.531529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d24e697-f760-49e9-b4bc-362786bb83bc","Type":"ContainerDied","Data":"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.531541 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d24e697-f760-49e9-b4bc-362786bb83bc","Type":"ContainerDied","Data":"ec86fc1ab0d85e474da177aeb83b2fc6f6c0a302ef4ec0b1667222bf7b13cd73"} Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.531598 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.549966 5200 scope.go:117] "RemoveContainer" containerID="a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.553614 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.553586101 podStartE2EDuration="2.553586101s" podCreationTimestamp="2025-11-26 15:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:02.546334459 +0000 UTC m=+5611.225536287" watchObservedRunningTime="2025-11-26 15:04:02.553586101 +0000 UTC m=+5611.232787919" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.580088 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.580120 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wcc5l\" (UniqueName: \"kubernetes.io/projected/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-kube-api-access-wcc5l\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.580128 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.580138 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684fd507-f2e3-4ab6-9313-89a84aa2a8fe-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.580162 5200 scope.go:117] "RemoveContainer" containerID="1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f" Nov 26 15:04:02 crc kubenswrapper[5200]: E1126 15:04:02.583558 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f\": container with ID starting with 1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f not found: ID does not exist" containerID="1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.583616 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f"} err="failed to get container status \"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f\": rpc error: code = NotFound desc = could not find container \"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f\": container with ID starting with 1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.583646 5200 scope.go:117] "RemoveContainer" containerID="a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43" Nov 26 15:04:02 crc kubenswrapper[5200]: E1126 15:04:02.586616 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43\": container with ID 
starting with a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43 not found: ID does not exist" containerID="a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.586699 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43"} err="failed to get container status \"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43\": rpc error: code = NotFound desc = could not find container \"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43\": container with ID starting with a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43 not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.586764 5200 scope.go:117] "RemoveContainer" containerID="1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.587656 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f"} err="failed to get container status \"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f\": rpc error: code = NotFound desc = could not find container \"1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f\": container with ID starting with 1a8b10053559c89cbfcbb6b21b5df4353da6e15c431ab915bc04d3c7fdb3286f not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.587691 5200 scope.go:117] "RemoveContainer" containerID="a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.587977 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43"} err="failed to get container status \"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43\": rpc error: code = NotFound desc = could not find container \"a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43\": container with ID starting with a427db40977200aef42ccca8545333734070c1d4d6d2fb7d54c0941cd5cabd43 not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.588013 5200 scope.go:117] "RemoveContainer" containerID="63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.600277 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.629486 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.641918 5200 scope.go:117] "RemoveContainer" containerID="08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.649085 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650644 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-api" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650671 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-api" Nov 26 15:04:02 crc 
kubenswrapper[5200]: I1126 15:04:02.650712 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c36d893d-1655-412a-ae4a-9ed40c6f68b2" containerName="nova-manage" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650774 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36d893d-1655-412a-ae4a-9ed40c6f68b2" containerName="nova-manage" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650809 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-log" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650817 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-log" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650829 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-log" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650838 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-log" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650845 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-metadata" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.650852 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-metadata" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.651099 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-log" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.651119 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-api" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.651138 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c36d893d-1655-412a-ae4a-9ed40c6f68b2" containerName="nova-manage" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.651158 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" containerName="nova-api-log" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.651172 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" containerName="nova-metadata-metadata" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.669025 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.669246 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.675523 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.697750 5200 scope.go:117] "RemoveContainer" containerID="63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1" Nov 26 15:04:02 crc kubenswrapper[5200]: E1126 15:04:02.698430 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1\": container with ID starting with 63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1 not found: ID does not exist" containerID="63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.698461 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1"} err="failed to get container status \"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1\": rpc error: code = NotFound desc = could not find container \"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1\": container with ID starting with 63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1 not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.698483 5200 scope.go:117] "RemoveContainer" containerID="08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db" Nov 26 15:04:02 crc kubenswrapper[5200]: E1126 15:04:02.698876 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db\": container with ID starting with 08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db not found: ID does not exist" containerID="08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.698922 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db"} err="failed to get container status \"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db\": rpc error: code = NotFound desc = could not find container \"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db\": container with ID starting with 08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.698955 5200 scope.go:117] "RemoveContainer" containerID="63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.699403 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1"} err="failed to get container status \"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1\": rpc error: code = NotFound desc = could not find container \"63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1\": container with ID starting with 63397ca173ad37399b5cab2f73962993f0fbb6280918a26c9daf9b5dbf7082a1 not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.699463 5200 
scope.go:117] "RemoveContainer" containerID="08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.699845 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db"} err="failed to get container status \"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db\": rpc error: code = NotFound desc = could not find container \"08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db\": container with ID starting with 08e95123ffba060f145c3bb4ba464a5a73d8690fb99ef27c0f7b737bf0edf3db not found: ID does not exist" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.783458 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbbs\" (UniqueName: \"kubernetes.io/projected/27607b46-907c-45e4-90e6-34fd3829deb7-kube-api-access-jtbbs\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.783593 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27607b46-907c-45e4-90e6-34fd3829deb7-logs\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.783932 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-config-data\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.784201 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.869773 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.882438 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.885919 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.887122 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbbs\" (UniqueName: \"kubernetes.io/projected/27607b46-907c-45e4-90e6-34fd3829deb7-kube-api-access-jtbbs\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.887299 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27607b46-907c-45e4-90e6-34fd3829deb7-logs\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 
15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.887567 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-config-data\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.887829 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27607b46-907c-45e4-90e6-34fd3829deb7-logs\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.892313 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.892582 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-config-data\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.906688 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.916248 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbbs\" (UniqueName: \"kubernetes.io/projected/27607b46-907c-45e4-90e6-34fd3829deb7-kube-api-access-jtbbs\") pod \"nova-api-0\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " pod="openstack/nova-api-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.982363 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.985840 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.988302 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684fd507-f2e3-4ab6-9313-89a84aa2a8fe" path="/var/lib/kubelet/pods/684fd507-f2e3-4ab6-9313-89a84aa2a8fe/volumes" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.989109 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d24e697-f760-49e9-b4bc-362786bb83bc" path="/var/lib/kubelet/pods/9d24e697-f760-49e9-b4bc-362786bb83bc/volumes" Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.989711 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:02 crc kubenswrapper[5200]: I1126 15:04:02.995783 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.093820 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-config-data\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.094142 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.094255 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4dn\" (UniqueName: \"kubernetes.io/projected/5b94d901-9543-463b-94ad-5dae6b7e9c98-kube-api-access-9c4dn\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.094534 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b94d901-9543-463b-94ad-5dae6b7e9c98-logs\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.197196 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4dn\" (UniqueName: \"kubernetes.io/projected/5b94d901-9543-463b-94ad-5dae6b7e9c98-kube-api-access-9c4dn\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.197592 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b94d901-9543-463b-94ad-5dae6b7e9c98-logs\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.198072 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b94d901-9543-463b-94ad-5dae6b7e9c98-logs\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.198248 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-config-data\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.198319 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.205492 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-config-data\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.207060 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.215753 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4dn\" (UniqueName: \"kubernetes.io/projected/5b94d901-9543-463b-94ad-5dae6b7e9c98-kube-api-access-9c4dn\") pod \"nova-metadata-0\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.305462 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.450017 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.478745 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.536925 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d6d878d7c-542xd"] Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.537192 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerName="dnsmasq-dns" containerID="cri-o://150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61" gracePeriod=10 Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.549395 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.554067 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27607b46-907c-45e4-90e6-34fd3829deb7","Type":"ContainerStarted","Data":"780dd00a01ccac9ed3a905b27687ad6ef4ee12d7526404f142d84ba909b0d938"} Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.595225 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:04:03 crc kubenswrapper[5200]: I1126 15:04:03.835858 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.047654 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.128481 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-sb\") pod \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.128553 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-nb\") pod \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.128681 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-dns-svc\") pod \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.128790 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsc8s\" (UniqueName: \"kubernetes.io/projected/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-kube-api-access-tsc8s\") pod \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.128896 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-config\") pod \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\" (UID: \"b9ca1ad7-10a2-42a3-8f79-e36941e707d6\") " Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.136784 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-kube-api-access-tsc8s" (OuterVolumeSpecName: "kube-api-access-tsc8s") pod "b9ca1ad7-10a2-42a3-8f79-e36941e707d6" (UID: "b9ca1ad7-10a2-42a3-8f79-e36941e707d6"). InnerVolumeSpecName "kube-api-access-tsc8s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.183327 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b9ca1ad7-10a2-42a3-8f79-e36941e707d6" (UID: "b9ca1ad7-10a2-42a3-8f79-e36941e707d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.197247 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-config" (OuterVolumeSpecName: "config") pod "b9ca1ad7-10a2-42a3-8f79-e36941e707d6" (UID: "b9ca1ad7-10a2-42a3-8f79-e36941e707d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.211885 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9ca1ad7-10a2-42a3-8f79-e36941e707d6" (UID: "b9ca1ad7-10a2-42a3-8f79-e36941e707d6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.228087 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9ca1ad7-10a2-42a3-8f79-e36941e707d6" (UID: "b9ca1ad7-10a2-42a3-8f79-e36941e707d6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.231521 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.231555 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.231568 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsc8s\" (UniqueName: \"kubernetes.io/projected/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-kube-api-access-tsc8s\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.231583 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.231594 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ca1ad7-10a2-42a3-8f79-e36941e707d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.567629 5200 generic.go:358] "Generic (PLEG): container finished" podID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerID="150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61" exitCode=0 Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.567867 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" event={"ID":"b9ca1ad7-10a2-42a3-8f79-e36941e707d6","Type":"ContainerDied","Data":"150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.567901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" event={"ID":"b9ca1ad7-10a2-42a3-8f79-e36941e707d6","Type":"ContainerDied","Data":"92372470445ddcf2c0bb9eaf02d0eac0a456115ad2586601d69a437f805cdb26"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.567923 5200 scope.go:117] "RemoveContainer" containerID="150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.568149 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d6d878d7c-542xd" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.577161 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b94d901-9543-463b-94ad-5dae6b7e9c98","Type":"ContainerStarted","Data":"b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.577219 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b94d901-9543-463b-94ad-5dae6b7e9c98","Type":"ContainerStarted","Data":"32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.577234 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b94d901-9543-463b-94ad-5dae6b7e9c98","Type":"ContainerStarted","Data":"191f95e4523c7d7dac47b1a38f4d60c21a8ed6c8dc56f1ad9ca120e3faaaa5e6"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.582203 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27607b46-907c-45e4-90e6-34fd3829deb7","Type":"ContainerStarted","Data":"112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.582234 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27607b46-907c-45e4-90e6-34fd3829deb7","Type":"ContainerStarted","Data":"ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5"} Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.592030 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.601963 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.601940791 podStartE2EDuration="2.601940791s" podCreationTimestamp="2025-11-26 15:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:04.600916434 +0000 UTC m=+5613.280118272" watchObservedRunningTime="2025-11-26 15:04:04.601940791 +0000 UTC m=+5613.281142609" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.610981 5200 scope.go:117] "RemoveContainer" containerID="30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.632897 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d6d878d7c-542xd"] Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.649453 5200 scope.go:117] "RemoveContainer" containerID="150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61" Nov 26 15:04:04 crc kubenswrapper[5200]: E1126 15:04:04.649988 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61\": container with ID starting with 150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61 not found: ID does not exist" containerID="150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.650039 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61"} err="failed to get container 
status \"150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61\": rpc error: code = NotFound desc = could not find container \"150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61\": container with ID starting with 150c7ff541fc50ff193a4257f94be08bd1ea0e4621fc624836ffd6abf254bb61 not found: ID does not exist" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.650067 5200 scope.go:117] "RemoveContainer" containerID="30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937" Nov 26 15:04:04 crc kubenswrapper[5200]: E1126 15:04:04.650388 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937\": container with ID starting with 30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937 not found: ID does not exist" containerID="30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.650438 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937"} err="failed to get container status \"30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937\": rpc error: code = NotFound desc = could not find container \"30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937\": container with ID starting with 30a87396c2a41245babf5bd3e6644d6d91d6c042435e8275ba3a47de29008937 not found: ID does not exist" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.654794 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d6d878d7c-542xd"] Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.670088 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6700663799999997 podStartE2EDuration="2.67006638s" podCreationTimestamp="2025-11-26 15:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:04.654082018 +0000 UTC m=+5613.333283846" watchObservedRunningTime="2025-11-26 15:04:04.67006638 +0000 UTC m=+5613.349268228" Nov 26 15:04:04 crc kubenswrapper[5200]: I1126 15:04:04.982126 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" path="/var/lib/kubelet/pods/b9ca1ad7-10a2-42a3-8f79-e36941e707d6/volumes" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.622600 5200 generic.go:358] "Generic (PLEG): container finished" podID="a757b0bc-87fc-439b-bbd5-3e0aea855eff" containerID="d0eafa9dde6fa57957feab076a6f2acfe739b429e4158b21749fd8a32a3f64e3" exitCode=0 Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.622891 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a757b0bc-87fc-439b-bbd5-3e0aea855eff","Type":"ContainerDied","Data":"d0eafa9dde6fa57957feab076a6f2acfe739b429e4158b21749fd8a32a3f64e3"} Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.623171 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.623176 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a757b0bc-87fc-439b-bbd5-3e0aea855eff","Type":"ContainerDied","Data":"fadcb2a0140c57aa37aa617cbec933bcbf35d7c0fb90b25bcfeb6f5b83723e3b"} Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.623198 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fadcb2a0140c57aa37aa617cbec933bcbf35d7c0fb90b25bcfeb6f5b83723e3b" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.680420 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-config-data\") pod \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.680585 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzsn5\" (UniqueName: \"kubernetes.io/projected/a757b0bc-87fc-439b-bbd5-3e0aea855eff-kube-api-access-qzsn5\") pod \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.680659 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-combined-ca-bundle\") pod \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\" (UID: \"a757b0bc-87fc-439b-bbd5-3e0aea855eff\") " Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.686951 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a757b0bc-87fc-439b-bbd5-3e0aea855eff-kube-api-access-qzsn5" (OuterVolumeSpecName: "kube-api-access-qzsn5") pod "a757b0bc-87fc-439b-bbd5-3e0aea855eff" (UID: "a757b0bc-87fc-439b-bbd5-3e0aea855eff"). InnerVolumeSpecName "kube-api-access-qzsn5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.705475 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a757b0bc-87fc-439b-bbd5-3e0aea855eff" (UID: "a757b0bc-87fc-439b-bbd5-3e0aea855eff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.716046 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-config-data" (OuterVolumeSpecName: "config-data") pod "a757b0bc-87fc-439b-bbd5-3e0aea855eff" (UID: "a757b0bc-87fc-439b-bbd5-3e0aea855eff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.782837 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.782896 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a757b0bc-87fc-439b-bbd5-3e0aea855eff-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:06 crc kubenswrapper[5200]: I1126 15:04:06.782908 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzsn5\" (UniqueName: \"kubernetes.io/projected/a757b0bc-87fc-439b-bbd5-3e0aea855eff-kube-api-access-qzsn5\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.632901 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.657872 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.672974 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.682840 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684384 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerName="dnsmasq-dns" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684409 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerName="dnsmasq-dns" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684428 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a757b0bc-87fc-439b-bbd5-3e0aea855eff" containerName="nova-scheduler-scheduler" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684437 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a757b0bc-87fc-439b-bbd5-3e0aea855eff" containerName="nova-scheduler-scheduler" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684480 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerName="init" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684488 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerName="init" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684748 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a757b0bc-87fc-439b-bbd5-3e0aea855eff" containerName="nova-scheduler-scheduler" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.684771 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9ca1ad7-10a2-42a3-8f79-e36941e707d6" containerName="dnsmasq-dns" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.723748 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.723897 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.726202 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.807018 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8745\" (UniqueName: \"kubernetes.io/projected/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-kube-api-access-x8745\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.807391 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.807534 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-config-data\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.909621 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8745\" (UniqueName: \"kubernetes.io/projected/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-kube-api-access-x8745\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.910021 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.910782 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-config-data\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.914624 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.916415 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-config-data\") pod \"nova-scheduler-0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:07 crc kubenswrapper[5200]: I1126 15:04:07.941654 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8745\" (UniqueName: \"kubernetes.io/projected/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-kube-api-access-x8745\") pod \"nova-scheduler-0\" (UID: 
\"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:08 crc kubenswrapper[5200]: I1126 15:04:08.041510 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:08 crc kubenswrapper[5200]: I1126 15:04:08.306321 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:04:08 crc kubenswrapper[5200]: I1126 15:04:08.306863 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:04:08 crc kubenswrapper[5200]: W1126 15:04:08.565243 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa51376_df9e_4aca_b1d0_dd2bfdd4dbc0.slice/crio-f7ccc5e7141670f9d6c38fc3d4c03bbfac78d046bccf6dbfad4f303a3340c1b9 WatchSource:0}: Error finding container f7ccc5e7141670f9d6c38fc3d4c03bbfac78d046bccf6dbfad4f303a3340c1b9: Status 404 returned error can't find the container with id f7ccc5e7141670f9d6c38fc3d4c03bbfac78d046bccf6dbfad4f303a3340c1b9 Nov 26 15:04:08 crc kubenswrapper[5200]: I1126 15:04:08.568804 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:08 crc kubenswrapper[5200]: I1126 15:04:08.645858 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0","Type":"ContainerStarted","Data":"f7ccc5e7141670f9d6c38fc3d4c03bbfac78d046bccf6dbfad4f303a3340c1b9"} Nov 26 15:04:08 crc kubenswrapper[5200]: I1126 15:04:08.996174 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a757b0bc-87fc-439b-bbd5-3e0aea855eff" path="/var/lib/kubelet/pods/a757b0bc-87fc-439b-bbd5-3e0aea855eff/volumes" Nov 26 15:04:09 crc kubenswrapper[5200]: I1126 15:04:09.613408 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 26 15:04:09 crc kubenswrapper[5200]: I1126 15:04:09.662758 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0","Type":"ContainerStarted","Data":"02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156"} Nov 26 15:04:09 crc kubenswrapper[5200]: I1126 15:04:09.694278 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6942538259999997 podStartE2EDuration="2.694253826s" podCreationTimestamp="2025-11-26 15:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:09.682312471 +0000 UTC m=+5618.361514329" watchObservedRunningTime="2025-11-26 15:04:09.694253826 +0000 UTC m=+5618.373455664" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.190370 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6h7j7"] Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.205554 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6h7j7"] Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.205703 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.208259 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-manage-config-data\"" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.213479 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-manage-scripts\"" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.367030 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9vqz\" (UniqueName: \"kubernetes.io/projected/afc7daf7-6707-493a-b1df-16a392e9b64c-kube-api-access-m9vqz\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.367077 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-scripts\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.367148 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-config-data\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.367172 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.469082 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9vqz\" (UniqueName: \"kubernetes.io/projected/afc7daf7-6707-493a-b1df-16a392e9b64c-kube-api-access-m9vqz\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.469132 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-scripts\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.469202 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-config-data\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.469226 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: 
\"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.475108 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-scripts\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.475461 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-config-data\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.476345 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.486075 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9vqz\" (UniqueName: \"kubernetes.io/projected/afc7daf7-6707-493a-b1df-16a392e9b64c-kube-api-access-m9vqz\") pod \"nova-cell1-cell-mapping-6h7j7\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:10 crc kubenswrapper[5200]: I1126 15:04:10.534741 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:11 crc kubenswrapper[5200]: I1126 15:04:11.039478 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6h7j7"] Nov 26 15:04:11 crc kubenswrapper[5200]: W1126 15:04:11.043564 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafc7daf7_6707_493a_b1df_16a392e9b64c.slice/crio-144d73a59d28f4933e548b140a00031c8f8384009c365f8f0d4570cfc96f93b7 WatchSource:0}: Error finding container 144d73a59d28f4933e548b140a00031c8f8384009c365f8f0d4570cfc96f93b7: Status 404 returned error can't find the container with id 144d73a59d28f4933e548b140a00031c8f8384009c365f8f0d4570cfc96f93b7 Nov 26 15:04:11 crc kubenswrapper[5200]: I1126 15:04:11.713370 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6h7j7" event={"ID":"afc7daf7-6707-493a-b1df-16a392e9b64c","Type":"ContainerStarted","Data":"124598b1ae2b78bad1a659d4876a95df98be647c2c4eacb2583aecbe29e6dfe1"} Nov 26 15:04:11 crc kubenswrapper[5200]: I1126 15:04:11.713826 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6h7j7" event={"ID":"afc7daf7-6707-493a-b1df-16a392e9b64c","Type":"ContainerStarted","Data":"144d73a59d28f4933e548b140a00031c8f8384009c365f8f0d4570cfc96f93b7"} Nov 26 15:04:12 crc kubenswrapper[5200]: I1126 15:04:12.997132 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:04:12 crc kubenswrapper[5200]: I1126 15:04:12.998767 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:04:13 crc kubenswrapper[5200]: I1126 15:04:13.043269 5200 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 15:04:13 crc kubenswrapper[5200]: I1126 15:04:13.306300 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:04:13 crc kubenswrapper[5200]: I1126 15:04:13.306835 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:04:14 crc kubenswrapper[5200]: I1126 15:04:14.038963 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:14 crc kubenswrapper[5200]: I1126 15:04:14.039856 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:14 crc kubenswrapper[5200]: I1126 15:04:14.347913 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:14 crc kubenswrapper[5200]: I1126 15:04:14.348901 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:16 crc kubenswrapper[5200]: I1126 15:04:16.780842 5200 generic.go:358] "Generic (PLEG): container finished" podID="afc7daf7-6707-493a-b1df-16a392e9b64c" containerID="124598b1ae2b78bad1a659d4876a95df98be647c2c4eacb2583aecbe29e6dfe1" exitCode=0 Nov 26 15:04:16 crc kubenswrapper[5200]: I1126 15:04:16.780946 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6h7j7" event={"ID":"afc7daf7-6707-493a-b1df-16a392e9b64c","Type":"ContainerDied","Data":"124598b1ae2b78bad1a659d4876a95df98be647c2c4eacb2583aecbe29e6dfe1"} Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.042870 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.078264 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.188548 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.253216 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-scripts\") pod \"afc7daf7-6707-493a-b1df-16a392e9b64c\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.253361 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-config-data\") pod \"afc7daf7-6707-493a-b1df-16a392e9b64c\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.253516 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-combined-ca-bundle\") pod \"afc7daf7-6707-493a-b1df-16a392e9b64c\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.253652 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9vqz\" (UniqueName: \"kubernetes.io/projected/afc7daf7-6707-493a-b1df-16a392e9b64c-kube-api-access-m9vqz\") pod \"afc7daf7-6707-493a-b1df-16a392e9b64c\" (UID: \"afc7daf7-6707-493a-b1df-16a392e9b64c\") " Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.258936 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc7daf7-6707-493a-b1df-16a392e9b64c-kube-api-access-m9vqz" (OuterVolumeSpecName: "kube-api-access-m9vqz") pod "afc7daf7-6707-493a-b1df-16a392e9b64c" (UID: "afc7daf7-6707-493a-b1df-16a392e9b64c"). InnerVolumeSpecName "kube-api-access-m9vqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.259079 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-scripts" (OuterVolumeSpecName: "scripts") pod "afc7daf7-6707-493a-b1df-16a392e9b64c" (UID: "afc7daf7-6707-493a-b1df-16a392e9b64c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.287920 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-config-data" (OuterVolumeSpecName: "config-data") pod "afc7daf7-6707-493a-b1df-16a392e9b64c" (UID: "afc7daf7-6707-493a-b1df-16a392e9b64c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.289304 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afc7daf7-6707-493a-b1df-16a392e9b64c" (UID: "afc7daf7-6707-493a-b1df-16a392e9b64c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.356051 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.356095 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.356111 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc7daf7-6707-493a-b1df-16a392e9b64c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.356126 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m9vqz\" (UniqueName: \"kubernetes.io/projected/afc7daf7-6707-493a-b1df-16a392e9b64c-kube-api-access-m9vqz\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.809090 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6h7j7" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.809159 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6h7j7" event={"ID":"afc7daf7-6707-493a-b1df-16a392e9b64c","Type":"ContainerDied","Data":"144d73a59d28f4933e548b140a00031c8f8384009c365f8f0d4570cfc96f93b7"} Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.809272 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="144d73a59d28f4933e548b140a00031c8f8384009c365f8f0d4570cfc96f93b7" Nov 26 15:04:18 crc kubenswrapper[5200]: I1126 15:04:18.853742 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 15:04:19 crc kubenswrapper[5200]: E1126 15:04:19.013033 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafc7daf7_6707_493a_b1df_16a392e9b64c.slice\": RecentStats: unable to find data in memory cache]" Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.103135 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.105025 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-log" containerID="cri-o://ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5" gracePeriod=30 Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.105328 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-api" containerID="cri-o://112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c" gracePeriod=30 Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.117990 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.118258 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" 
containerName="nova-metadata-log" containerID="cri-o://32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b" gracePeriod=30 Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.118368 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-metadata" containerID="cri-o://b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823" gracePeriod=30 Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.314590 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.830168 5200 generic.go:358] "Generic (PLEG): container finished" podID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerID="32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b" exitCode=143 Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.830245 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b94d901-9543-463b-94ad-5dae6b7e9c98","Type":"ContainerDied","Data":"32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b"} Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.833749 5200 generic.go:358] "Generic (PLEG): container finished" podID="27607b46-907c-45e4-90e6-34fd3829deb7" containerID="ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5" exitCode=143 Nov 26 15:04:19 crc kubenswrapper[5200]: I1126 15:04:19.833830 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27607b46-907c-45e4-90e6-34fd3829deb7","Type":"ContainerDied","Data":"ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5"} Nov 26 15:04:20 crc kubenswrapper[5200]: I1126 15:04:20.848797 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" containerName="nova-scheduler-scheduler" containerID="cri-o://02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" gracePeriod=30 Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.782407 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.790281 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.867066 5200 generic.go:358] "Generic (PLEG): container finished" podID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerID="b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823" exitCode=0 Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.867126 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.867188 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b94d901-9543-463b-94ad-5dae6b7e9c98","Type":"ContainerDied","Data":"b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823"} Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.867225 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b94d901-9543-463b-94ad-5dae6b7e9c98","Type":"ContainerDied","Data":"191f95e4523c7d7dac47b1a38f4d60c21a8ed6c8dc56f1ad9ca120e3faaaa5e6"} Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.867243 5200 scope.go:117] "RemoveContainer" containerID="b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.869592 5200 generic.go:358] "Generic (PLEG): container finished" podID="27607b46-907c-45e4-90e6-34fd3829deb7" containerID="112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c" exitCode=0 Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.870049 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27607b46-907c-45e4-90e6-34fd3829deb7","Type":"ContainerDied","Data":"112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c"} Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.870093 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27607b46-907c-45e4-90e6-34fd3829deb7","Type":"ContainerDied","Data":"780dd00a01ccac9ed3a905b27687ad6ef4ee12d7526404f142d84ba909b0d938"} Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.870179 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.888289 5200 scope.go:117] "RemoveContainer" containerID="32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.913241 5200 scope.go:117] "RemoveContainer" containerID="b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823" Nov 26 15:04:22 crc kubenswrapper[5200]: E1126 15:04:22.913612 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823\": container with ID starting with b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823 not found: ID does not exist" containerID="b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.913645 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823"} err="failed to get container status \"b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823\": rpc error: code = NotFound desc = could not find container \"b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823\": container with ID starting with b2cb3fd84c6ce91efd65f3d3294a41f9fbc724652d3324c750f15beedef80823 not found: ID does not exist" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.913666 5200 scope.go:117] "RemoveContainer" containerID="32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b" Nov 26 15:04:22 crc kubenswrapper[5200]: E1126 15:04:22.914041 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b\": container with ID starting with 32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b not found: ID does not exist" containerID="32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.914062 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b"} err="failed to get container status \"32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b\": rpc error: code = NotFound desc = could not find container \"32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b\": container with ID starting with 32985fd6ea2f24ac6e8a5fdb01f6d543a4cdc587ad32b7bde00333e1c00d630b not found: ID does not exist" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.914073 5200 scope.go:117] "RemoveContainer" containerID="112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.935493 5200 scope.go:117] "RemoveContainer" containerID="ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.954574 5200 scope.go:117] "RemoveContainer" containerID="112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c" Nov 26 15:04:22 crc kubenswrapper[5200]: E1126 15:04:22.955047 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c\": container with ID starting with 
112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c not found: ID does not exist" containerID="112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.955084 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c"} err="failed to get container status \"112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c\": rpc error: code = NotFound desc = could not find container \"112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c\": container with ID starting with 112e8f5f67ebaef00b7f660827f7b0b1a2cd180190fe1a84a8adf5205e35cf5c not found: ID does not exist" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.955108 5200 scope.go:117] "RemoveContainer" containerID="ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5" Nov 26 15:04:22 crc kubenswrapper[5200]: E1126 15:04:22.955759 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5\": container with ID starting with ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5 not found: ID does not exist" containerID="ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.955872 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5"} err="failed to get container status \"ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5\": rpc error: code = NotFound desc = could not find container \"ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5\": container with ID starting with ba23e4a7e79389a5c3645abaf1c1241f7926c782a437e7309b70217bccc977a5 not found: ID does not exist" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962716 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-config-data\") pod \"27607b46-907c-45e4-90e6-34fd3829deb7\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962762 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbbs\" (UniqueName: \"kubernetes.io/projected/27607b46-907c-45e4-90e6-34fd3829deb7-kube-api-access-jtbbs\") pod \"27607b46-907c-45e4-90e6-34fd3829deb7\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962785 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-combined-ca-bundle\") pod \"5b94d901-9543-463b-94ad-5dae6b7e9c98\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962882 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-combined-ca-bundle\") pod \"27607b46-907c-45e4-90e6-34fd3829deb7\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962902 5200 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-config-data\") pod \"5b94d901-9543-463b-94ad-5dae6b7e9c98\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962938 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4dn\" (UniqueName: \"kubernetes.io/projected/5b94d901-9543-463b-94ad-5dae6b7e9c98-kube-api-access-9c4dn\") pod \"5b94d901-9543-463b-94ad-5dae6b7e9c98\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.962978 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27607b46-907c-45e4-90e6-34fd3829deb7-logs\") pod \"27607b46-907c-45e4-90e6-34fd3829deb7\" (UID: \"27607b46-907c-45e4-90e6-34fd3829deb7\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.963009 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b94d901-9543-463b-94ad-5dae6b7e9c98-logs\") pod \"5b94d901-9543-463b-94ad-5dae6b7e9c98\" (UID: \"5b94d901-9543-463b-94ad-5dae6b7e9c98\") " Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.964300 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27607b46-907c-45e4-90e6-34fd3829deb7-logs" (OuterVolumeSpecName: "logs") pod "27607b46-907c-45e4-90e6-34fd3829deb7" (UID: "27607b46-907c-45e4-90e6-34fd3829deb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.965772 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b94d901-9543-463b-94ad-5dae6b7e9c98-logs" (OuterVolumeSpecName: "logs") pod "5b94d901-9543-463b-94ad-5dae6b7e9c98" (UID: "5b94d901-9543-463b-94ad-5dae6b7e9c98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.968964 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b94d901-9543-463b-94ad-5dae6b7e9c98-kube-api-access-9c4dn" (OuterVolumeSpecName: "kube-api-access-9c4dn") pod "5b94d901-9543-463b-94ad-5dae6b7e9c98" (UID: "5b94d901-9543-463b-94ad-5dae6b7e9c98"). InnerVolumeSpecName "kube-api-access-9c4dn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.975070 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27607b46-907c-45e4-90e6-34fd3829deb7-kube-api-access-jtbbs" (OuterVolumeSpecName: "kube-api-access-jtbbs") pod "27607b46-907c-45e4-90e6-34fd3829deb7" (UID: "27607b46-907c-45e4-90e6-34fd3829deb7"). InnerVolumeSpecName "kube-api-access-jtbbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.987544 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b94d901-9543-463b-94ad-5dae6b7e9c98" (UID: "5b94d901-9543-463b-94ad-5dae6b7e9c98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:22 crc kubenswrapper[5200]: I1126 15:04:22.990818 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27607b46-907c-45e4-90e6-34fd3829deb7" (UID: "27607b46-907c-45e4-90e6-34fd3829deb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.001468 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-config-data" (OuterVolumeSpecName: "config-data") pod "27607b46-907c-45e4-90e6-34fd3829deb7" (UID: "27607b46-907c-45e4-90e6-34fd3829deb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.012587 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-config-data" (OuterVolumeSpecName: "config-data") pod "5b94d901-9543-463b-94ad-5dae6b7e9c98" (UID: "5b94d901-9543-463b-94ad-5dae6b7e9c98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.064914 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtbbs\" (UniqueName: \"kubernetes.io/projected/27607b46-907c-45e4-90e6-34fd3829deb7-kube-api-access-jtbbs\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.064954 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.064964 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.064974 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b94d901-9543-463b-94ad-5dae6b7e9c98-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.064983 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9c4dn\" (UniqueName: \"kubernetes.io/projected/5b94d901-9543-463b-94ad-5dae6b7e9c98-kube-api-access-9c4dn\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.064992 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27607b46-907c-45e4-90e6-34fd3829deb7-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.065000 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b94d901-9543-463b-94ad-5dae6b7e9c98-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.065008 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27607b46-907c-45e4-90e6-34fd3829deb7-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.207741 5200 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.217519 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.229588 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.240321 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.250133 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251643 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afc7daf7-6707-493a-b1df-16a392e9b64c" containerName="nova-manage" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251668 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc7daf7-6707-493a-b1df-16a392e9b64c" containerName="nova-manage" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251708 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-metadata" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251733 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-metadata" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251748 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-api" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251756 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-api" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251788 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-log" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251795 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-log" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251814 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-log" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.251820 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-log" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.252179 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-log" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.252208 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-log" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.252228 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" containerName="nova-api-api" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.252237 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" containerName="nova-metadata-metadata" Nov 26 
15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.252244 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="afc7daf7-6707-493a-b1df-16a392e9b64c" containerName="nova-manage" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.268508 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.268668 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.270570 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.271211 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.277390 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.279052 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.279121 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.371924 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4t4z\" (UniqueName: \"kubernetes.io/projected/587cd81d-cae9-4cb3-b029-c9513934be26-kube-api-access-n4t4z\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.371985 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587cd81d-cae9-4cb3-b029-c9513934be26-logs\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.372094 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-config-data\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.372123 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0180cef3-1322-4a69-a824-3fa1e2fc4de9-logs\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.372313 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-config-data\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.372369 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p8t8\" (UniqueName: \"kubernetes.io/projected/0180cef3-1322-4a69-a824-3fa1e2fc4de9-kube-api-access-6p8t8\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " 
pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.372482 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.372528 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.473719 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-config-data\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474052 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p8t8\" (UniqueName: \"kubernetes.io/projected/0180cef3-1322-4a69-a824-3fa1e2fc4de9-kube-api-access-6p8t8\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474094 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474126 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474188 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4t4z\" (UniqueName: \"kubernetes.io/projected/587cd81d-cae9-4cb3-b029-c9513934be26-kube-api-access-n4t4z\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474228 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587cd81d-cae9-4cb3-b029-c9513934be26-logs\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474260 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-config-data\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474288 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0180cef3-1322-4a69-a824-3fa1e2fc4de9-logs\") pod 
\"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.474913 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587cd81d-cae9-4cb3-b029-c9513934be26-logs\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.475033 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0180cef3-1322-4a69-a824-3fa1e2fc4de9-logs\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.478031 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.479122 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.480317 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-config-data\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.481803 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-config-data\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.490040 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4t4z\" (UniqueName: \"kubernetes.io/projected/587cd81d-cae9-4cb3-b029-c9513934be26-kube-api-access-n4t4z\") pod \"nova-metadata-0\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.492611 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p8t8\" (UniqueName: \"kubernetes.io/projected/0180cef3-1322-4a69-a824-3fa1e2fc4de9-kube-api-access-6p8t8\") pod \"nova-api-0\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.597243 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:04:23 crc kubenswrapper[5200]: I1126 15:04:23.613592 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:04:23 crc kubenswrapper[5200]: E1126 15:04:23.814676 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:04:23 crc kubenswrapper[5200]: E1126 15:04:23.818619 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:04:23 crc kubenswrapper[5200]: E1126 15:04:23.821272 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:04:23 crc kubenswrapper[5200]: E1126 15:04:23.821366 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" containerName="nova-scheduler-scheduler" probeResult="unknown" Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.061750 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.147885 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:04:24 crc kubenswrapper[5200]: W1126 15:04:24.151833 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0180cef3_1322_4a69_a824_3fa1e2fc4de9.slice/crio-78cfe7ac0294fa293a975acbe1830d41086248be0401722e0dfead5d10c8f20d WatchSource:0}: Error finding container 78cfe7ac0294fa293a975acbe1830d41086248be0401722e0dfead5d10c8f20d: Status 404 returned error can't find the container with id 78cfe7ac0294fa293a975acbe1830d41086248be0401722e0dfead5d10c8f20d Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.901120 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"587cd81d-cae9-4cb3-b029-c9513934be26","Type":"ContainerStarted","Data":"bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa"} Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.902696 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"587cd81d-cae9-4cb3-b029-c9513934be26","Type":"ContainerStarted","Data":"280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08"} Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.902810 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"587cd81d-cae9-4cb3-b029-c9513934be26","Type":"ContainerStarted","Data":"78aee3a198345618d63855dd670e9e8e532e0a7d5ff049b012d238afab96e74e"} Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.903491 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0180cef3-1322-4a69-a824-3fa1e2fc4de9","Type":"ContainerStarted","Data":"e2f6cf12989b01dc9fb5623c1c74e99f6c35f90c63bb8868807947c16746e8ce"} Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.903516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0180cef3-1322-4a69-a824-3fa1e2fc4de9","Type":"ContainerStarted","Data":"c3acc206e4f3310bb235932e9d2defbec84e2d81dea1df2b6587f519882225b7"} Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.903525 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0180cef3-1322-4a69-a824-3fa1e2fc4de9","Type":"ContainerStarted","Data":"78cfe7ac0294fa293a975acbe1830d41086248be0401722e0dfead5d10c8f20d"} Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.930359 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9303418479999999 podStartE2EDuration="1.930341848s" podCreationTimestamp="2025-11-26 15:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:24.922281095 +0000 UTC m=+5633.601482933" watchObservedRunningTime="2025-11-26 15:04:24.930341848 +0000 UTC m=+5633.609543666" Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.951382 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9513606129999999 podStartE2EDuration="1.951360613s" podCreationTimestamp="2025-11-26 15:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:24.94331786 +0000 UTC m=+5633.622519698" watchObservedRunningTime="2025-11-26 15:04:24.951360613 +0000 UTC m=+5633.630562431" Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.979128 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27607b46-907c-45e4-90e6-34fd3829deb7" path="/var/lib/kubelet/pods/27607b46-907c-45e4-90e6-34fd3829deb7/volumes" Nov 26 15:04:24 crc kubenswrapper[5200]: I1126 15:04:24.980450 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b94d901-9543-463b-94ad-5dae6b7e9c98" path="/var/lib/kubelet/pods/5b94d901-9543-463b-94ad-5dae6b7e9c98/volumes" Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.879829 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.922113 5200 generic.go:358] "Generic (PLEG): container finished" podID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" exitCode=0 Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.923502 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0","Type":"ContainerDied","Data":"02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156"} Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.923562 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0","Type":"ContainerDied","Data":"f7ccc5e7141670f9d6c38fc3d4c03bbfac78d046bccf6dbfad4f303a3340c1b9"} Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.923567 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.923585 5200 scope.go:117] "RemoveContainer" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.925477 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-config-data\") pod \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.925730 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8745\" (UniqueName: \"kubernetes.io/projected/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-kube-api-access-x8745\") pod \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.925918 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-combined-ca-bundle\") pod \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\" (UID: \"1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0\") " Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.933953 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-kube-api-access-x8745" (OuterVolumeSpecName: "kube-api-access-x8745") pod "1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" (UID: "1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0"). InnerVolumeSpecName "kube-api-access-x8745". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.954825 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" (UID: "1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:25 crc kubenswrapper[5200]: I1126 15:04:25.955430 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-config-data" (OuterVolumeSpecName: "config-data") pod "1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" (UID: "1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.025082 5200 scope.go:117] "RemoveContainer" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" Nov 26 15:04:26 crc kubenswrapper[5200]: E1126 15:04:26.025756 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156\": container with ID starting with 02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156 not found: ID does not exist" containerID="02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.025798 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156"} err="failed to get container status \"02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156\": rpc error: code = NotFound desc = could not find container \"02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156\": container with ID starting with 02b088edfacd61c2b1899ada8c47865e75ab7372e5382d712bbfe9dc277c6156 not found: ID does not exist" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.028588 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.028611 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8745\" (UniqueName: \"kubernetes.io/projected/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-kube-api-access-x8745\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.028622 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.281509 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.306839 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.313373 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.314862 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" containerName="nova-scheduler-scheduler" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.314892 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" containerName="nova-scheduler-scheduler" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.315170 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" containerName="nova-scheduler-scheduler" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.322988 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.323798 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.326107 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.436110 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s275x\" (UniqueName: \"kubernetes.io/projected/a235b77d-17ac-4633-9030-940179d31fa9-kube-api-access-s275x\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.436189 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.436414 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-config-data\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.538821 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-config-data\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.538948 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s275x\" (UniqueName: \"kubernetes.io/projected/a235b77d-17ac-4633-9030-940179d31fa9-kube-api-access-s275x\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.539263 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.550990 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-config-data\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.552456 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.559942 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s275x\" (UniqueName: 
\"kubernetes.io/projected/a235b77d-17ac-4633-9030-940179d31fa9-kube-api-access-s275x\") pod \"nova-scheduler-0\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.640991 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:04:26 crc kubenswrapper[5200]: I1126 15:04:26.982745 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0" path="/var/lib/kubelet/pods/1fa51376-df9e-4aca-b1d0-dd2bfdd4dbc0/volumes" Nov 26 15:04:27 crc kubenswrapper[5200]: I1126 15:04:27.150464 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:04:27 crc kubenswrapper[5200]: I1126 15:04:27.945851 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a235b77d-17ac-4633-9030-940179d31fa9","Type":"ContainerStarted","Data":"4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b"} Nov 26 15:04:27 crc kubenswrapper[5200]: I1126 15:04:27.946214 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a235b77d-17ac-4633-9030-940179d31fa9","Type":"ContainerStarted","Data":"cf69c8719edf5b9189c26e9362487acc277ab83ecbe912d87efa8652d96d21de"} Nov 26 15:04:27 crc kubenswrapper[5200]: I1126 15:04:27.965597 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9655760899999999 podStartE2EDuration="1.96557609s" podCreationTimestamp="2025-11-26 15:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:27.963560807 +0000 UTC m=+5636.642762695" watchObservedRunningTime="2025-11-26 15:04:27.96557609 +0000 UTC m=+5636.644777908" Nov 26 15:04:28 crc kubenswrapper[5200]: I1126 15:04:28.347549 5200 scope.go:117] "RemoveContainer" containerID="74d6c21bb57ef5eaf5ce9e86ea093a5b52f956d560c01f74162995431febd430" Nov 26 15:04:28 crc kubenswrapper[5200]: I1126 15:04:28.598382 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:04:28 crc kubenswrapper[5200]: I1126 15:04:28.598457 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:04:31 crc kubenswrapper[5200]: I1126 15:04:31.642183 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 15:04:33 crc kubenswrapper[5200]: I1126 15:04:33.597773 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:04:33 crc kubenswrapper[5200]: I1126 15:04:33.598163 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:04:33 crc kubenswrapper[5200]: I1126 15:04:33.614552 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:04:33 crc kubenswrapper[5200]: I1126 15:04:33.614659 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:04:34 crc kubenswrapper[5200]: I1126 15:04:34.638997 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:34 crc kubenswrapper[5200]: I1126 15:04:34.639022 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:34 crc kubenswrapper[5200]: I1126 15:04:34.655928 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:34 crc kubenswrapper[5200]: I1126 15:04:34.660854 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:04:36 crc kubenswrapper[5200]: I1126 15:04:36.641759 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 15:04:36 crc kubenswrapper[5200]: I1126 15:04:36.672784 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 15:04:37 crc kubenswrapper[5200]: I1126 15:04:37.074351 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 15:04:42 crc kubenswrapper[5200]: E1126 15:04:42.161168 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1758932 actualBytes=10240 Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.601072 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.601372 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.603455 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.603866 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.619843 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.620181 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.624462 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:04:43 crc kubenswrapper[5200]: I1126 15:04:43.627204 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.148063 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.151420 5200 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.315416 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54bf7754f7-7qmmv"] Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.337318 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.341378 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54bf7754f7-7qmmv"] Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.431127 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-dns-svc\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.431189 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdsg\" (UniqueName: \"kubernetes.io/projected/263fe4d1-7768-425c-893f-5f4bc676a8e4-kube-api-access-wkdsg\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.431456 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-nb\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.431505 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-config\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.431547 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-sb\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.533988 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-dns-svc\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.534100 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdsg\" (UniqueName: \"kubernetes.io/projected/263fe4d1-7768-425c-893f-5f4bc676a8e4-kube-api-access-wkdsg\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.534560 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-nb\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.534585 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-config\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.534618 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-sb\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.535201 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-dns-svc\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.535615 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-sb\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.535890 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-nb\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.537593 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-config\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.557437 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkdsg\" (UniqueName: \"kubernetes.io/projected/263fe4d1-7768-425c-893f-5f4bc676a8e4-kube-api-access-wkdsg\") pod \"dnsmasq-dns-54bf7754f7-7qmmv\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:44 crc kubenswrapper[5200]: I1126 15:04:44.667434 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:45 crc kubenswrapper[5200]: I1126 15:04:45.152710 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54bf7754f7-7qmmv"] Nov 26 15:04:46 crc kubenswrapper[5200]: I1126 15:04:46.167947 5200 generic.go:358] "Generic (PLEG): container finished" podID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerID="a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d" exitCode=0 Nov 26 15:04:46 crc kubenswrapper[5200]: I1126 15:04:46.168113 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" event={"ID":"263fe4d1-7768-425c-893f-5f4bc676a8e4","Type":"ContainerDied","Data":"a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d"} Nov 26 15:04:46 crc kubenswrapper[5200]: I1126 15:04:46.168567 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" event={"ID":"263fe4d1-7768-425c-893f-5f4bc676a8e4","Type":"ContainerStarted","Data":"09500fa7064f07e0949443821f06bafc443cfb718587cffed4e6a67c242649c1"} Nov 26 15:04:47 crc kubenswrapper[5200]: I1126 15:04:47.179161 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" event={"ID":"263fe4d1-7768-425c-893f-5f4bc676a8e4","Type":"ContainerStarted","Data":"9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75"} Nov 26 15:04:47 crc kubenswrapper[5200]: I1126 15:04:47.179619 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:47 crc kubenswrapper[5200]: I1126 15:04:47.201236 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" podStartSLOduration=3.201217351 podStartE2EDuration="3.201217351s" podCreationTimestamp="2025-11-26 15:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:04:47.198753286 +0000 UTC m=+5655.877955114" watchObservedRunningTime="2025-11-26 15:04:47.201217351 +0000 UTC m=+5655.880419159" Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.197546 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.312912 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b67fff77-czhrj"] Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.313507 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="dnsmasq-dns" containerID="cri-o://f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3" gracePeriod=10 Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.449029 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.64:5353: connect: connection refused" Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.933358 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.947580 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-sb\") pod \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.947632 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkzv\" (UniqueName: \"kubernetes.io/projected/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-kube-api-access-sbkzv\") pod \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.947786 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-config\") pod \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.947809 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-dns-svc\") pod \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.947977 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-nb\") pod \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\" (UID: \"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf\") " Nov 26 15:04:53 crc kubenswrapper[5200]: I1126 15:04:53.957649 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-kube-api-access-sbkzv" (OuterVolumeSpecName: "kube-api-access-sbkzv") pod "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" (UID: "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf"). InnerVolumeSpecName "kube-api-access-sbkzv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.014378 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" (UID: "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.027274 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-config" (OuterVolumeSpecName: "config") pod "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" (UID: "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.028420 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" (UID: "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.048701 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" (UID: "493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.049899 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.049919 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.049928 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.049942 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.049957 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbkzv\" (UniqueName: \"kubernetes.io/projected/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf-kube-api-access-sbkzv\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.287448 5200 generic.go:358] "Generic (PLEG): container finished" podID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerID="f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3" exitCode=0 Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.287531 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.287544 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" event={"ID":"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf","Type":"ContainerDied","Data":"f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3"} Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.287605 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b67fff77-czhrj" event={"ID":"493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf","Type":"ContainerDied","Data":"d8faef96794eecc3ca9ec42d4c92609f27b6c369243ab0d9dbf9060d7854e61f"} Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.287629 5200 scope.go:117] "RemoveContainer" containerID="f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.309443 5200 scope.go:117] "RemoveContainer" containerID="aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.334537 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b67fff77-czhrj"] Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.345332 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b67fff77-czhrj"] Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.350492 5200 scope.go:117] "RemoveContainer" containerID="f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3" Nov 26 15:04:54 crc kubenswrapper[5200]: E1126 15:04:54.350859 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3\": container with ID starting with f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3 not found: ID does not exist" containerID="f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.350900 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3"} err="failed to get container status \"f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3\": rpc error: code = NotFound desc = could not find container \"f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3\": container with ID starting with f0234d1cdce27f23832f93226bbbb46527ae700b25b825fe710d3196416837b3 not found: ID does not exist" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.350929 5200 scope.go:117] "RemoveContainer" containerID="aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2" Nov 26 15:04:54 crc kubenswrapper[5200]: E1126 15:04:54.351157 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2\": container with ID starting with aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2 not found: ID does not exist" containerID="aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.351185 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2"} err="failed to get container status 
\"aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2\": rpc error: code = NotFound desc = could not find container \"aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2\": container with ID starting with aff37f1037c50780ab1b644a5bce6ef4c3ae67715f925d439ba88b6ad49cbaf2 not found: ID does not exist" Nov 26 15:04:54 crc kubenswrapper[5200]: I1126 15:04:54.985166 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" path="/var/lib/kubelet/pods/493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf/volumes" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.131554 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sdpqn"] Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.133377 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="dnsmasq-dns" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.133434 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="dnsmasq-dns" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.133516 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="init" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.133526 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="init" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.133781 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="493f1a59-ca3d-4ec9-9626-6b6eeb02dbcf" containerName="dnsmasq-dns" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.141159 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sdpqn"] Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.141289 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.197743 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9757e8-8ae6-4526-bb80-38e4decb481b-operator-scripts\") pod \"cinder-db-create-sdpqn\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.197874 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572sz\" (UniqueName: \"kubernetes.io/projected/ba9757e8-8ae6-4526-bb80-38e4decb481b-kube-api-access-572sz\") pod \"cinder-db-create-sdpqn\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.230656 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-ecf3-account-create-update-t9f9q"] Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.238960 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.240882 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-db-secret\"" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.243158 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ecf3-account-create-update-t9f9q"] Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.300934 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9757e8-8ae6-4526-bb80-38e4decb481b-operator-scripts\") pod \"cinder-db-create-sdpqn\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.301013 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-operator-scripts\") pod \"cinder-ecf3-account-create-update-t9f9q\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.301103 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-572sz\" (UniqueName: \"kubernetes.io/projected/ba9757e8-8ae6-4526-bb80-38e4decb481b-kube-api-access-572sz\") pod \"cinder-db-create-sdpqn\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.301167 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpt4\" (UniqueName: \"kubernetes.io/projected/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-kube-api-access-jfpt4\") pod \"cinder-ecf3-account-create-update-t9f9q\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.301933 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9757e8-8ae6-4526-bb80-38e4decb481b-operator-scripts\") pod \"cinder-db-create-sdpqn\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.327253 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-572sz\" (UniqueName: \"kubernetes.io/projected/ba9757e8-8ae6-4526-bb80-38e4decb481b-kube-api-access-572sz\") pod \"cinder-db-create-sdpqn\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.402625 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-operator-scripts\") pod \"cinder-ecf3-account-create-update-t9f9q\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.402792 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpt4\" (UniqueName: \"kubernetes.io/projected/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-kube-api-access-jfpt4\") pod 
\"cinder-ecf3-account-create-update-t9f9q\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.403324 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-operator-scripts\") pod \"cinder-ecf3-account-create-update-t9f9q\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.423321 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpt4\" (UniqueName: \"kubernetes.io/projected/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-kube-api-access-jfpt4\") pod \"cinder-ecf3-account-create-update-t9f9q\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.466257 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.556977 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.956397 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sdpqn"] Nov 26 15:04:56 crc kubenswrapper[5200]: I1126 15:04:56.962746 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:04:57 crc kubenswrapper[5200]: I1126 15:04:57.078271 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ecf3-account-create-update-t9f9q"] Nov 26 15:04:57 crc kubenswrapper[5200]: I1126 15:04:57.321393 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ecf3-account-create-update-t9f9q" event={"ID":"db638b8d-4a3d-4c79-a7e1-70be9299bd2e","Type":"ContainerStarted","Data":"1b107f87efe6d2a973ca9408110e56c1cd2d7e3ff0c6b1477ea7fa46c7badb0a"} Nov 26 15:04:57 crc kubenswrapper[5200]: I1126 15:04:57.321672 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ecf3-account-create-update-t9f9q" event={"ID":"db638b8d-4a3d-4c79-a7e1-70be9299bd2e","Type":"ContainerStarted","Data":"4ad661f757cebeb9bd323eb18ad371734bf5ce02e6ce75bbde961e26db050cd6"} Nov 26 15:04:57 crc kubenswrapper[5200]: I1126 15:04:57.323620 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sdpqn" event={"ID":"ba9757e8-8ae6-4526-bb80-38e4decb481b","Type":"ContainerStarted","Data":"014be205dd250d19db454eb3e8bedfbea45f4f5af81598a86843b5a0fac5f517"} Nov 26 15:04:57 crc kubenswrapper[5200]: I1126 15:04:57.323660 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sdpqn" event={"ID":"ba9757e8-8ae6-4526-bb80-38e4decb481b","Type":"ContainerStarted","Data":"423ed9aeb766536aabd1a954d6c942dc2f103443cba86339b6fbdb303844a80a"} Nov 26 15:04:57 crc kubenswrapper[5200]: I1126 15:04:57.347169 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ecf3-account-create-update-t9f9q" podStartSLOduration=1.347152516 podStartE2EDuration="1.347152516s" podCreationTimestamp="2025-11-26 15:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 15:04:57.345123042 +0000 UTC m=+5666.024324860" watchObservedRunningTime="2025-11-26 15:04:57.347152516 +0000 UTC m=+5666.026354334" Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.336920 5200 generic.go:358] "Generic (PLEG): container finished" podID="ba9757e8-8ae6-4526-bb80-38e4decb481b" containerID="014be205dd250d19db454eb3e8bedfbea45f4f5af81598a86843b5a0fac5f517" exitCode=0 Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.337306 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sdpqn" event={"ID":"ba9757e8-8ae6-4526-bb80-38e4decb481b","Type":"ContainerDied","Data":"014be205dd250d19db454eb3e8bedfbea45f4f5af81598a86843b5a0fac5f517"} Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.340596 5200 generic.go:358] "Generic (PLEG): container finished" podID="db638b8d-4a3d-4c79-a7e1-70be9299bd2e" containerID="1b107f87efe6d2a973ca9408110e56c1cd2d7e3ff0c6b1477ea7fa46c7badb0a" exitCode=0 Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.340638 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ecf3-account-create-update-t9f9q" event={"ID":"db638b8d-4a3d-4c79-a7e1-70be9299bd2e","Type":"ContainerDied","Data":"1b107f87efe6d2a973ca9408110e56c1cd2d7e3ff0c6b1477ea7fa46c7badb0a"} Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.808806 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.852610 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9757e8-8ae6-4526-bb80-38e4decb481b-operator-scripts\") pod \"ba9757e8-8ae6-4526-bb80-38e4decb481b\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.852669 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572sz\" (UniqueName: \"kubernetes.io/projected/ba9757e8-8ae6-4526-bb80-38e4decb481b-kube-api-access-572sz\") pod \"ba9757e8-8ae6-4526-bb80-38e4decb481b\" (UID: \"ba9757e8-8ae6-4526-bb80-38e4decb481b\") " Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.854316 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba9757e8-8ae6-4526-bb80-38e4decb481b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba9757e8-8ae6-4526-bb80-38e4decb481b" (UID: "ba9757e8-8ae6-4526-bb80-38e4decb481b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.862221 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9757e8-8ae6-4526-bb80-38e4decb481b-kube-api-access-572sz" (OuterVolumeSpecName: "kube-api-access-572sz") pod "ba9757e8-8ae6-4526-bb80-38e4decb481b" (UID: "ba9757e8-8ae6-4526-bb80-38e4decb481b"). InnerVolumeSpecName "kube-api-access-572sz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.954789 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9757e8-8ae6-4526-bb80-38e4decb481b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:58 crc kubenswrapper[5200]: I1126 15:04:58.954823 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-572sz\" (UniqueName: \"kubernetes.io/projected/ba9757e8-8ae6-4526-bb80-38e4decb481b-kube-api-access-572sz\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.362155 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sdpqn" event={"ID":"ba9757e8-8ae6-4526-bb80-38e4decb481b","Type":"ContainerDied","Data":"423ed9aeb766536aabd1a954d6c942dc2f103443cba86339b6fbdb303844a80a"} Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.362228 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423ed9aeb766536aabd1a954d6c942dc2f103443cba86339b6fbdb303844a80a" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.362313 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sdpqn" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.788523 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.876357 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-operator-scripts\") pod \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.876842 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfpt4\" (UniqueName: \"kubernetes.io/projected/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-kube-api-access-jfpt4\") pod \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\" (UID: \"db638b8d-4a3d-4c79-a7e1-70be9299bd2e\") " Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.876921 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db638b8d-4a3d-4c79-a7e1-70be9299bd2e" (UID: "db638b8d-4a3d-4c79-a7e1-70be9299bd2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.877575 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.883104 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-kube-api-access-jfpt4" (OuterVolumeSpecName: "kube-api-access-jfpt4") pod "db638b8d-4a3d-4c79-a7e1-70be9299bd2e" (UID: "db638b8d-4a3d-4c79-a7e1-70be9299bd2e"). InnerVolumeSpecName "kube-api-access-jfpt4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:04:59 crc kubenswrapper[5200]: I1126 15:04:59.979097 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jfpt4\" (UniqueName: \"kubernetes.io/projected/db638b8d-4a3d-4c79-a7e1-70be9299bd2e-kube-api-access-jfpt4\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:00 crc kubenswrapper[5200]: I1126 15:05:00.374260 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ecf3-account-create-update-t9f9q" event={"ID":"db638b8d-4a3d-4c79-a7e1-70be9299bd2e","Type":"ContainerDied","Data":"4ad661f757cebeb9bd323eb18ad371734bf5ce02e6ce75bbde961e26db050cd6"} Nov 26 15:05:00 crc kubenswrapper[5200]: I1126 15:05:00.375588 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad661f757cebeb9bd323eb18ad371734bf5ce02e6ce75bbde961e26db050cd6" Nov 26 15:05:00 crc kubenswrapper[5200]: I1126 15:05:00.375227 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ecf3-account-create-update-t9f9q" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.486021 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-69jk8"] Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.488456 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba9757e8-8ae6-4526-bb80-38e4decb481b" containerName="mariadb-database-create" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.488495 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9757e8-8ae6-4526-bb80-38e4decb481b" containerName="mariadb-database-create" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.488532 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db638b8d-4a3d-4c79-a7e1-70be9299bd2e" containerName="mariadb-account-create-update" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.488544 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="db638b8d-4a3d-4c79-a7e1-70be9299bd2e" containerName="mariadb-account-create-update" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.488864 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba9757e8-8ae6-4526-bb80-38e4decb481b" containerName="mariadb-database-create" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.488904 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="db638b8d-4a3d-4c79-a7e1-70be9299bd2e" containerName="mariadb-account-create-update" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.498530 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.508119 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-blrxc\"" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.510167 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.510802 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.534240 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-69jk8"] Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.618938 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-config-data\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.619014 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-db-sync-config-data\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.619491 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c70e5dae-5910-47ef-83b5-5cfcf024a27e-etc-machine-id\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.619735 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-scripts\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.619797 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-combined-ca-bundle\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.620068 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l55jt\" (UniqueName: \"kubernetes.io/projected/c70e5dae-5910-47ef-83b5-5cfcf024a27e-kube-api-access-l55jt\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.721917 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c70e5dae-5910-47ef-83b5-5cfcf024a27e-etc-machine-id\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.721990 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-scripts\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.722016 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-combined-ca-bundle\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.722064 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l55jt\" (UniqueName: \"kubernetes.io/projected/c70e5dae-5910-47ef-83b5-5cfcf024a27e-kube-api-access-l55jt\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.722137 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-config-data\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.722153 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-db-sync-config-data\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.723473 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c70e5dae-5910-47ef-83b5-5cfcf024a27e-etc-machine-id\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.729855 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-combined-ca-bundle\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.730199 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-db-sync-config-data\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.730298 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-scripts\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.741818 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-config-data\") pod \"cinder-db-sync-69jk8\" (UID: 
\"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.742607 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l55jt\" (UniqueName: \"kubernetes.io/projected/c70e5dae-5910-47ef-83b5-5cfcf024a27e-kube-api-access-l55jt\") pod \"cinder-db-sync-69jk8\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:01 crc kubenswrapper[5200]: I1126 15:05:01.835439 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:02 crc kubenswrapper[5200]: I1126 15:05:02.308298 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-69jk8"] Nov 26 15:05:02 crc kubenswrapper[5200]: I1126 15:05:02.442017 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-69jk8" event={"ID":"c70e5dae-5910-47ef-83b5-5cfcf024a27e","Type":"ContainerStarted","Data":"dcfe3ef9b08e4ec64f28bd3863f2a7ae2351ead69b3f79f19ae9f778a2b4ca69"} Nov 26 15:05:03 crc kubenswrapper[5200]: I1126 15:05:03.454361 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-69jk8" event={"ID":"c70e5dae-5910-47ef-83b5-5cfcf024a27e","Type":"ContainerStarted","Data":"80ca65d6b46b71f1284b057212fefbee1b899841c85115c98cae1ed6dcd50b96"} Nov 26 15:05:03 crc kubenswrapper[5200]: I1126 15:05:03.478031 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-69jk8" podStartSLOduration=2.477983314 podStartE2EDuration="2.477983314s" podCreationTimestamp="2025-11-26 15:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:03.474204474 +0000 UTC m=+5672.153406302" watchObservedRunningTime="2025-11-26 15:05:03.477983314 +0000 UTC m=+5672.157185142" Nov 26 15:05:07 crc kubenswrapper[5200]: I1126 15:05:07.504725 5200 generic.go:358] "Generic (PLEG): container finished" podID="c70e5dae-5910-47ef-83b5-5cfcf024a27e" containerID="80ca65d6b46b71f1284b057212fefbee1b899841c85115c98cae1ed6dcd50b96" exitCode=0 Nov 26 15:05:07 crc kubenswrapper[5200]: I1126 15:05:07.504791 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-69jk8" event={"ID":"c70e5dae-5910-47ef-83b5-5cfcf024a27e","Type":"ContainerDied","Data":"80ca65d6b46b71f1284b057212fefbee1b899841c85115c98cae1ed6dcd50b96"} Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.894715 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.965831 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-scripts\") pod \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966121 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c70e5dae-5910-47ef-83b5-5cfcf024a27e-etc-machine-id\") pod \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966205 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c70e5dae-5910-47ef-83b5-5cfcf024a27e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c70e5dae-5910-47ef-83b5-5cfcf024a27e" (UID: "c70e5dae-5910-47ef-83b5-5cfcf024a27e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966349 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l55jt\" (UniqueName: \"kubernetes.io/projected/c70e5dae-5910-47ef-83b5-5cfcf024a27e-kube-api-access-l55jt\") pod \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966379 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-config-data\") pod \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966448 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-db-sync-config-data\") pod \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966521 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-combined-ca-bundle\") pod \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\" (UID: \"c70e5dae-5910-47ef-83b5-5cfcf024a27e\") " Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.966956 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c70e5dae-5910-47ef-83b5-5cfcf024a27e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.975612 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c70e5dae-5910-47ef-83b5-5cfcf024a27e" (UID: "c70e5dae-5910-47ef-83b5-5cfcf024a27e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.976235 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-scripts" (OuterVolumeSpecName: "scripts") pod "c70e5dae-5910-47ef-83b5-5cfcf024a27e" (UID: "c70e5dae-5910-47ef-83b5-5cfcf024a27e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:08 crc kubenswrapper[5200]: I1126 15:05:08.984882 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70e5dae-5910-47ef-83b5-5cfcf024a27e-kube-api-access-l55jt" (OuterVolumeSpecName: "kube-api-access-l55jt") pod "c70e5dae-5910-47ef-83b5-5cfcf024a27e" (UID: "c70e5dae-5910-47ef-83b5-5cfcf024a27e"). InnerVolumeSpecName "kube-api-access-l55jt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.028563 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c70e5dae-5910-47ef-83b5-5cfcf024a27e" (UID: "c70e5dae-5910-47ef-83b5-5cfcf024a27e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.037910 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-config-data" (OuterVolumeSpecName: "config-data") pod "c70e5dae-5910-47ef-83b5-5cfcf024a27e" (UID: "c70e5dae-5910-47ef-83b5-5cfcf024a27e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.070078 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l55jt\" (UniqueName: \"kubernetes.io/projected/c70e5dae-5910-47ef-83b5-5cfcf024a27e-kube-api-access-l55jt\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.070145 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.070166 5200 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.070183 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.070201 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c70e5dae-5910-47ef-83b5-5cfcf024a27e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.533828 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-69jk8" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.533870 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-69jk8" event={"ID":"c70e5dae-5910-47ef-83b5-5cfcf024a27e","Type":"ContainerDied","Data":"dcfe3ef9b08e4ec64f28bd3863f2a7ae2351ead69b3f79f19ae9f778a2b4ca69"} Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.533940 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcfe3ef9b08e4ec64f28bd3863f2a7ae2351ead69b3f79f19ae9f778a2b4ca69" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.939050 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd68489b9-gnwvm"] Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.940961 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c70e5dae-5910-47ef-83b5-5cfcf024a27e" containerName="cinder-db-sync" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.940982 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70e5dae-5910-47ef-83b5-5cfcf024a27e" containerName="cinder-db-sync" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.941232 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c70e5dae-5910-47ef-83b5-5cfcf024a27e" containerName="cinder-db-sync" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.955585 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.974275 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd68489b9-gnwvm"] Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.995149 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.995250 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-config\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.995408 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfq7w\" (UniqueName: \"kubernetes.io/projected/5813b9c1-50b0-4a0c-90c3-fa2de1398326-kube-api-access-wfq7w\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.995456 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-dns-svc\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:09 crc kubenswrapper[5200]: I1126 15:05:09.995584 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.097729 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.097823 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.097875 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-config\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.097988 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfq7w\" (UniqueName: \"kubernetes.io/projected/5813b9c1-50b0-4a0c-90c3-fa2de1398326-kube-api-access-wfq7w\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.098032 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-dns-svc\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.099274 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-dns-svc\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.104074 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-config\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.104498 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.104557 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: 
\"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.138533 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfq7w\" (UniqueName: \"kubernetes.io/projected/5813b9c1-50b0-4a0c-90c3-fa2de1398326-kube-api-access-wfq7w\") pod \"dnsmasq-dns-7bd68489b9-gnwvm\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.204179 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.247167 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.247324 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.250624 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.251091 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-api-config-data\"" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.251342 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.251524 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-blrxc\"" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.287526 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.306132 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8xl\" (UniqueName: \"kubernetes.io/projected/835e6a17-af89-438e-b46f-55d74c21d11e-kube-api-access-kj8xl\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.306226 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835e6a17-af89-438e-b46f-55d74c21d11e-logs\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.309983 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.310060 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data-custom\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.310120 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/835e6a17-af89-438e-b46f-55d74c21d11e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.310172 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-scripts\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.310286 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.411967 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412120 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8xl\" (UniqueName: \"kubernetes.io/projected/835e6a17-af89-438e-b46f-55d74c21d11e-kube-api-access-kj8xl\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412167 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/835e6a17-af89-438e-b46f-55d74c21d11e-logs\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412205 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412251 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data-custom\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412296 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/835e6a17-af89-438e-b46f-55d74c21d11e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412324 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-scripts\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.412674 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/835e6a17-af89-438e-b46f-55d74c21d11e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.413040 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835e6a17-af89-438e-b46f-55d74c21d11e-logs\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.417146 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.417484 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-scripts\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.418124 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data-custom\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.422565 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.457051 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8xl\" (UniqueName: \"kubernetes.io/projected/835e6a17-af89-438e-b46f-55d74c21d11e-kube-api-access-kj8xl\") pod \"cinder-api-0\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.572924 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:05:10 crc kubenswrapper[5200]: I1126 15:05:10.826174 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd68489b9-gnwvm"] Nov 26 15:05:11 crc kubenswrapper[5200]: I1126 15:05:11.090283 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:11 crc kubenswrapper[5200]: I1126 15:05:11.564279 5200 generic.go:358] "Generic (PLEG): container finished" podID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerID="8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9" exitCode=0 Nov 26 15:05:11 crc kubenswrapper[5200]: I1126 15:05:11.565529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" event={"ID":"5813b9c1-50b0-4a0c-90c3-fa2de1398326","Type":"ContainerDied","Data":"8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9"} Nov 26 15:05:11 crc kubenswrapper[5200]: I1126 15:05:11.565605 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" event={"ID":"5813b9c1-50b0-4a0c-90c3-fa2de1398326","Type":"ContainerStarted","Data":"9a60f84e677758d575899b934c20bb95d8cf48e1fd945668f2cdbd7c8886ba10"} Nov 26 15:05:11 crc kubenswrapper[5200]: I1126 15:05:11.574962 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"835e6a17-af89-438e-b46f-55d74c21d11e","Type":"ContainerStarted","Data":"8bcae0c23d9e439a7a2bcc2347e9729c5119376b42f844ec40065d8bcb9c24f7"} Nov 26 15:05:12 crc kubenswrapper[5200]: I1126 15:05:12.587532 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"835e6a17-af89-438e-b46f-55d74c21d11e","Type":"ContainerStarted","Data":"a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2"} Nov 26 15:05:12 crc kubenswrapper[5200]: I1126 15:05:12.591522 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" event={"ID":"5813b9c1-50b0-4a0c-90c3-fa2de1398326","Type":"ContainerStarted","Data":"3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2"} Nov 26 15:05:12 crc kubenswrapper[5200]: I1126 15:05:12.591843 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:12 crc kubenswrapper[5200]: I1126 15:05:12.623631 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" podStartSLOduration=3.6235996139999997 podStartE2EDuration="3.623599614s" podCreationTimestamp="2025-11-26 15:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:12.607415677 +0000 UTC m=+5681.286617505" watchObservedRunningTime="2025-11-26 15:05:12.623599614 
+0000 UTC m=+5681.302801432" Nov 26 15:05:13 crc kubenswrapper[5200]: I1126 15:05:13.602960 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"835e6a17-af89-438e-b46f-55d74c21d11e","Type":"ContainerStarted","Data":"1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63"} Nov 26 15:05:13 crc kubenswrapper[5200]: I1126 15:05:13.603808 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/cinder-api-0" Nov 26 15:05:13 crc kubenswrapper[5200]: I1126 15:05:13.631509 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.63148265 podStartE2EDuration="3.63148265s" podCreationTimestamp="2025-11-26 15:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:13.621306691 +0000 UTC m=+5682.300508579" watchObservedRunningTime="2025-11-26 15:05:13.63148265 +0000 UTC m=+5682.310684468" Nov 26 15:05:18 crc kubenswrapper[5200]: I1126 15:05:18.605359 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:05:18 crc kubenswrapper[5200]: I1126 15:05:18.709181 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54bf7754f7-7qmmv"] Nov 26 15:05:18 crc kubenswrapper[5200]: I1126 15:05:18.709482 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerName="dnsmasq-dns" containerID="cri-o://9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75" gracePeriod=10 Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.300708 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.431319 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-nb\") pod \"263fe4d1-7768-425c-893f-5f4bc676a8e4\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.431526 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-config\") pod \"263fe4d1-7768-425c-893f-5f4bc676a8e4\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.431562 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-dns-svc\") pod \"263fe4d1-7768-425c-893f-5f4bc676a8e4\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.431600 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-sb\") pod \"263fe4d1-7768-425c-893f-5f4bc676a8e4\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.431669 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkdsg\" (UniqueName: \"kubernetes.io/projected/263fe4d1-7768-425c-893f-5f4bc676a8e4-kube-api-access-wkdsg\") pod \"263fe4d1-7768-425c-893f-5f4bc676a8e4\" (UID: \"263fe4d1-7768-425c-893f-5f4bc676a8e4\") " Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.440457 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263fe4d1-7768-425c-893f-5f4bc676a8e4-kube-api-access-wkdsg" (OuterVolumeSpecName: "kube-api-access-wkdsg") pod "263fe4d1-7768-425c-893f-5f4bc676a8e4" (UID: "263fe4d1-7768-425c-893f-5f4bc676a8e4"). InnerVolumeSpecName "kube-api-access-wkdsg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.479305 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "263fe4d1-7768-425c-893f-5f4bc676a8e4" (UID: "263fe4d1-7768-425c-893f-5f4bc676a8e4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.480438 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "263fe4d1-7768-425c-893f-5f4bc676a8e4" (UID: "263fe4d1-7768-425c-893f-5f4bc676a8e4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.480834 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-config" (OuterVolumeSpecName: "config") pod "263fe4d1-7768-425c-893f-5f4bc676a8e4" (UID: "263fe4d1-7768-425c-893f-5f4bc676a8e4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.482454 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "263fe4d1-7768-425c-893f-5f4bc676a8e4" (UID: "263fe4d1-7768-425c-893f-5f4bc676a8e4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.534099 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkdsg\" (UniqueName: \"kubernetes.io/projected/263fe4d1-7768-425c-893f-5f4bc676a8e4-kube-api-access-wkdsg\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.534131 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.534141 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.534151 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.534158 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/263fe4d1-7768-425c-893f-5f4bc676a8e4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.678275 5200 generic.go:358] "Generic (PLEG): container finished" podID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerID="9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75" exitCode=0 Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.678346 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" event={"ID":"263fe4d1-7768-425c-893f-5f4bc676a8e4","Type":"ContainerDied","Data":"9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75"} Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.678377 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.678415 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54bf7754f7-7qmmv" event={"ID":"263fe4d1-7768-425c-893f-5f4bc676a8e4","Type":"ContainerDied","Data":"09500fa7064f07e0949443821f06bafc443cfb718587cffed4e6a67c242649c1"} Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.678436 5200 scope.go:117] "RemoveContainer" containerID="9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.699307 5200 scope.go:117] "RemoveContainer" containerID="a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.717953 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54bf7754f7-7qmmv"] Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.728857 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54bf7754f7-7qmmv"] Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.736990 5200 scope.go:117] "RemoveContainer" containerID="9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75" Nov 26 15:05:19 crc kubenswrapper[5200]: E1126 15:05:19.737628 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75\": container with ID starting with 9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75 not found: ID does not exist" containerID="9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.737708 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75"} err="failed to get container status \"9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75\": rpc error: code = NotFound desc = could not find container \"9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75\": container with ID starting with 9f7cad0f58be75e6a76827a1dd492c1e826f74de53201bff9731b54c66ef5b75 not found: ID does not exist" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.737738 5200 scope.go:117] "RemoveContainer" containerID="a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d" Nov 26 15:05:19 crc kubenswrapper[5200]: E1126 15:05:19.738062 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d\": container with ID starting with a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d not found: ID does not exist" containerID="a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d" Nov 26 15:05:19 crc kubenswrapper[5200]: I1126 15:05:19.738090 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d"} err="failed to get container status \"a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d\": rpc error: code = NotFound desc = could not find container \"a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d\": container with ID starting with a1cb7c61eb4ee74c95c2635f94b6a7563f1c55a4caec6088bc45dc8ba826549d not found: ID does not exist" Nov 26 
15:05:20 crc kubenswrapper[5200]: I1126 15:05:20.982306 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" path="/var/lib/kubelet/pods/263fe4d1-7768-425c-893f-5f4bc676a8e4/volumes" Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.814907 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.815862 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-log" containerID="cri-o://c3acc206e4f3310bb235932e9d2defbec84e2d81dea1df2b6587f519882225b7" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.815914 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-api" containerID="cri-o://e2f6cf12989b01dc9fb5623c1c74e99f6c35f90c63bb8868807947c16746e8ce" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.831945 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.832517 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a235b77d-17ac-4633-9030-940179d31fa9" containerName="nova-scheduler-scheduler" containerID="cri-o://4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.846938 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.849991 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.860148 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.860473 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-log" containerID="cri-o://280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.861102 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-metadata" containerID="cri-o://bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.864952 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.865228 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ff3b780a-3460-455b-9bdc-070f1c747d7a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ea853a65412e0624d2988d49e4cdecbe87ad0112d998218517e34f11724c848b" gracePeriod=30 Nov 26 15:05:21 crc kubenswrapper[5200]: 
I1126 15:05:21.937581 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:05:21 crc kubenswrapper[5200]: I1126 15:05:21.938235 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="2c678887-cc1d-4694-9cd6-8fbacccacd07" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" gracePeriod=30 Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.012094 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.014604 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.019213 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.019291 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" containerName="nova-cell0-conductor-conductor" probeResult="unknown" Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.050662 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.052157 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.053866 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:05:22 crc kubenswrapper[5200]: E1126 15:05:22.053932 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="a235b77d-17ac-4633-9030-940179d31fa9" containerName="nova-scheduler-scheduler" probeResult="unknown" Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.724620 5200 generic.go:358] "Generic (PLEG): container finished" podID="ff3b780a-3460-455b-9bdc-070f1c747d7a" containerID="ea853a65412e0624d2988d49e4cdecbe87ad0112d998218517e34f11724c848b" exitCode=0 Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.724790 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ff3b780a-3460-455b-9bdc-070f1c747d7a","Type":"ContainerDied","Data":"ea853a65412e0624d2988d49e4cdecbe87ad0112d998218517e34f11724c848b"} Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.729367 5200 generic.go:358] "Generic (PLEG): container finished" podID="587cd81d-cae9-4cb3-b029-c9513934be26" containerID="280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08" exitCode=143 Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.729863 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"587cd81d-cae9-4cb3-b029-c9513934be26","Type":"ContainerDied","Data":"280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08"} Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.732117 5200 generic.go:358] "Generic (PLEG): container finished" podID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerID="c3acc206e4f3310bb235932e9d2defbec84e2d81dea1df2b6587f519882225b7" exitCode=143 Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.732250 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0180cef3-1322-4a69-a824-3fa1e2fc4de9","Type":"ContainerDied","Data":"c3acc206e4f3310bb235932e9d2defbec84e2d81dea1df2b6587f519882225b7"} Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.862559 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.926405 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-combined-ca-bundle\") pod \"ff3b780a-3460-455b-9bdc-070f1c747d7a\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.926565 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4n5f\" (UniqueName: \"kubernetes.io/projected/ff3b780a-3460-455b-9bdc-070f1c747d7a-kube-api-access-z4n5f\") pod \"ff3b780a-3460-455b-9bdc-070f1c747d7a\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.926599 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-config-data\") pod \"ff3b780a-3460-455b-9bdc-070f1c747d7a\" (UID: \"ff3b780a-3460-455b-9bdc-070f1c747d7a\") " Nov 26 15:05:22 crc kubenswrapper[5200]: I1126 15:05:22.969709 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3b780a-3460-455b-9bdc-070f1c747d7a-kube-api-access-z4n5f" (OuterVolumeSpecName: "kube-api-access-z4n5f") pod "ff3b780a-3460-455b-9bdc-070f1c747d7a" (UID: "ff3b780a-3460-455b-9bdc-070f1c747d7a"). InnerVolumeSpecName "kube-api-access-z4n5f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.011146 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-config-data" (OuterVolumeSpecName: "config-data") pod "ff3b780a-3460-455b-9bdc-070f1c747d7a" (UID: "ff3b780a-3460-455b-9bdc-070f1c747d7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.021584 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff3b780a-3460-455b-9bdc-070f1c747d7a" (UID: "ff3b780a-3460-455b-9bdc-070f1c747d7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.030042 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.030104 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3b780a-3460-455b-9bdc-070f1c747d7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.030121 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4n5f\" (UniqueName: \"kubernetes.io/projected/ff3b780a-3460-455b-9bdc-070f1c747d7a-kube-api-access-z4n5f\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.181499 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.685050 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.749247 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ff3b780a-3460-455b-9bdc-070f1c747d7a","Type":"ContainerDied","Data":"82b3ff54e4e60a52bca6b62f69d54dbd29c1152200d6da48ab3d21859795df21"} Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.749340 5200 scope.go:117] "RemoveContainer" containerID="ea853a65412e0624d2988d49e4cdecbe87ad0112d998218517e34f11724c848b" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.749258 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.754775 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-combined-ca-bundle\") pod \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.754858 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc7l8\" (UniqueName: \"kubernetes.io/projected/96f31a1f-212f-4e9d-bda1-91f2bde393ba-kube-api-access-zc7l8\") pod \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.754957 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-config-data\") pod \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\" (UID: \"96f31a1f-212f-4e9d-bda1-91f2bde393ba\") " Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.756964 5200 generic.go:358] "Generic (PLEG): container finished" podID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" exitCode=0 Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.757050 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96f31a1f-212f-4e9d-bda1-91f2bde393ba","Type":"ContainerDied","Data":"9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb"} Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.757090 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96f31a1f-212f-4e9d-bda1-91f2bde393ba","Type":"ContainerDied","Data":"42e32d040d711d1333f5057d9aeea7c5437b913ea1d236e881fae1d6f4193b8a"} Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.757170 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.779631 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f31a1f-212f-4e9d-bda1-91f2bde393ba-kube-api-access-zc7l8" (OuterVolumeSpecName: "kube-api-access-zc7l8") pod "96f31a1f-212f-4e9d-bda1-91f2bde393ba" (UID: "96f31a1f-212f-4e9d-bda1-91f2bde393ba"). InnerVolumeSpecName "kube-api-access-zc7l8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.789349 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96f31a1f-212f-4e9d-bda1-91f2bde393ba" (UID: "96f31a1f-212f-4e9d-bda1-91f2bde393ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.792775 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-config-data" (OuterVolumeSpecName: "config-data") pod "96f31a1f-212f-4e9d-bda1-91f2bde393ba" (UID: "96f31a1f-212f-4e9d-bda1-91f2bde393ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.803752 5200 scope.go:117] "RemoveContainer" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.806007 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.837511 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.856937 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.857230 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zc7l8\" (UniqueName: \"kubernetes.io/projected/96f31a1f-212f-4e9d-bda1-91f2bde393ba-kube-api-access-zc7l8\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.857310 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f31a1f-212f-4e9d-bda1-91f2bde393ba-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.857802 5200 scope.go:117] "RemoveContainer" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.859916 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:05:23 crc kubenswrapper[5200]: E1126 15:05:23.861349 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb\": container with ID starting with 9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb not found: ID does not exist" containerID="9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.861410 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb"} err="failed to get container status \"9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb\": rpc error: code = NotFound desc = could not find container \"9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb\": container with ID starting with 9725eae9e240dc050a2088356ff0b116a75553a4ec9dca6fd58efb7dff242fbb not found: ID does not exist" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.861995 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerName="init" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862039 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerName="init" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862071 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff3b780a-3460-455b-9bdc-070f1c747d7a" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862095 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3b780a-3460-455b-9bdc-070f1c747d7a" 
containerName="nova-cell1-novncproxy-novncproxy" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862121 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" containerName="nova-cell0-conductor-conductor" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862127 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" containerName="nova-cell0-conductor-conductor" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862198 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerName="dnsmasq-dns" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862204 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerName="dnsmasq-dns" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862565 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="263fe4d1-7768-425c-893f-5f4bc676a8e4" containerName="dnsmasq-dns" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862593 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff3b780a-3460-455b-9bdc-070f1c747d7a" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.862608 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" containerName="nova-cell0-conductor-conductor" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.916357 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.916546 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.921893 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-novncproxy-config-data\"" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.959179 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9203d723-6e51-4b7a-a443-3e9983dc3fa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.959639 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7jf\" (UniqueName: \"kubernetes.io/projected/9203d723-6e51-4b7a-a443-3e9983dc3fa1-kube-api-access-5d7jf\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:23 crc kubenswrapper[5200]: I1126 15:05:23.959854 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9203d723-6e51-4b7a-a443-3e9983dc3fa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.063457 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d7jf\" (UniqueName: \"kubernetes.io/projected/9203d723-6e51-4b7a-a443-3e9983dc3fa1-kube-api-access-5d7jf\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.063542 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9203d723-6e51-4b7a-a443-3e9983dc3fa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.063729 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9203d723-6e51-4b7a-a443-3e9983dc3fa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.073125 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9203d723-6e51-4b7a-a443-3e9983dc3fa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.080662 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7jf\" (UniqueName: \"kubernetes.io/projected/9203d723-6e51-4b7a-a443-3e9983dc3fa1-kube-api-access-5d7jf\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.087770 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9203d723-6e51-4b7a-a443-3e9983dc3fa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9203d723-6e51-4b7a-a443-3e9983dc3fa1\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.180520 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.204442 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.218646 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.248585 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.256016 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.256206 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.260165 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.382830 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.383199 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cbs\" (UniqueName: \"kubernetes.io/projected/49ca5aba-a47e-4a54-97d1-bde52324fe21-kube-api-access-m4cbs\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.383330 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.486311 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.486922 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cbs\" (UniqueName: \"kubernetes.io/projected/49ca5aba-a47e-4a54-97d1-bde52324fe21-kube-api-access-m4cbs\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.486992 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.502062 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.503592 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.515675 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cbs\" (UniqueName: \"kubernetes.io/projected/49ca5aba-a47e-4a54-97d1-bde52324fe21-kube-api-access-m4cbs\") pod \"nova-cell0-conductor-0\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: E1126 15:05:24.593287 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 15:05:24 crc kubenswrapper[5200]: E1126 15:05:24.595313 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 15:05:24 crc kubenswrapper[5200]: E1126 15:05:24.600355 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 15:05:24 crc kubenswrapper[5200]: E1126 15:05:24.600399 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="2c678887-cc1d-4694-9cd6-8fbacccacd07" containerName="nova-cell1-conductor-conductor" probeResult="unknown" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.691351 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:24 crc kubenswrapper[5200]: I1126 15:05:24.791536 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:24.999540 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f31a1f-212f-4e9d-bda1-91f2bde393ba" path="/var/lib/kubelet/pods/96f31a1f-212f-4e9d-bda1-91f2bde393ba/volumes" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.000524 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3b780a-3460-455b-9bdc-070f1c747d7a" path="/var/lib/kubelet/pods/ff3b780a-3460-455b-9bdc-070f1c747d7a/volumes" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.018062 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": read tcp 10.217.0.2:55534->10.217.1.71:8775: read: connection reset by peer" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.018556 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": read tcp 10.217.0.2:55550->10.217.1.71:8775: read: connection reset by peer" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.211070 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.492931 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.525028 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-combined-ca-bundle\") pod \"2c678887-cc1d-4694-9cd6-8fbacccacd07\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.525170 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-config-data\") pod \"2c678887-cc1d-4694-9cd6-8fbacccacd07\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.557874 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-config-data" (OuterVolumeSpecName: "config-data") pod "2c678887-cc1d-4694-9cd6-8fbacccacd07" (UID: "2c678887-cc1d-4694-9cd6-8fbacccacd07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.609025 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c678887-cc1d-4694-9cd6-8fbacccacd07" (UID: "2c678887-cc1d-4694-9cd6-8fbacccacd07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.627119 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmvd\" (UniqueName: \"kubernetes.io/projected/2c678887-cc1d-4694-9cd6-8fbacccacd07-kube-api-access-vmmvd\") pod \"2c678887-cc1d-4694-9cd6-8fbacccacd07\" (UID: \"2c678887-cc1d-4694-9cd6-8fbacccacd07\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.627567 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.627580 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c678887-cc1d-4694-9cd6-8fbacccacd07-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.632497 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c678887-cc1d-4694-9cd6-8fbacccacd07-kube-api-access-vmmvd" (OuterVolumeSpecName: "kube-api-access-vmmvd") pod "2c678887-cc1d-4694-9cd6-8fbacccacd07" (UID: "2c678887-cc1d-4694-9cd6-8fbacccacd07"). InnerVolumeSpecName "kube-api-access-vmmvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.636941 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.728551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-config-data\") pod \"587cd81d-cae9-4cb3-b029-c9513934be26\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.728885 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587cd81d-cae9-4cb3-b029-c9513934be26-logs\") pod \"587cd81d-cae9-4cb3-b029-c9513934be26\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.728944 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4t4z\" (UniqueName: \"kubernetes.io/projected/587cd81d-cae9-4cb3-b029-c9513934be26-kube-api-access-n4t4z\") pod \"587cd81d-cae9-4cb3-b029-c9513934be26\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.728972 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-combined-ca-bundle\") pod \"587cd81d-cae9-4cb3-b029-c9513934be26\" (UID: \"587cd81d-cae9-4cb3-b029-c9513934be26\") " Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.729920 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587cd81d-cae9-4cb3-b029-c9513934be26-logs" (OuterVolumeSpecName: "logs") pod "587cd81d-cae9-4cb3-b029-c9513934be26" (UID: "587cd81d-cae9-4cb3-b029-c9513934be26"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.731099 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vmmvd\" (UniqueName: \"kubernetes.io/projected/2c678887-cc1d-4694-9cd6-8fbacccacd07-kube-api-access-vmmvd\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.731128 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587cd81d-cae9-4cb3-b029-c9513934be26-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.735517 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587cd81d-cae9-4cb3-b029-c9513934be26-kube-api-access-n4t4z" (OuterVolumeSpecName: "kube-api-access-n4t4z") pod "587cd81d-cae9-4cb3-b029-c9513934be26" (UID: "587cd81d-cae9-4cb3-b029-c9513934be26"). InnerVolumeSpecName "kube-api-access-n4t4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.776167 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "587cd81d-cae9-4cb3-b029-c9513934be26" (UID: "587cd81d-cae9-4cb3-b029-c9513934be26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.795576 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-config-data" (OuterVolumeSpecName: "config-data") pod "587cd81d-cae9-4cb3-b029-c9513934be26" (UID: "587cd81d-cae9-4cb3-b029-c9513934be26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.802876 5200 generic.go:358] "Generic (PLEG): container finished" podID="a235b77d-17ac-4633-9030-940179d31fa9" containerID="4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b" exitCode=0 Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.802911 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a235b77d-17ac-4633-9030-940179d31fa9","Type":"ContainerDied","Data":"4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.813408 5200 generic.go:358] "Generic (PLEG): container finished" podID="587cd81d-cae9-4cb3-b029-c9513934be26" containerID="bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa" exitCode=0 Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.813555 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"587cd81d-cae9-4cb3-b029-c9513934be26","Type":"ContainerDied","Data":"bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.813584 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"587cd81d-cae9-4cb3-b029-c9513934be26","Type":"ContainerDied","Data":"78aee3a198345618d63855dd670e9e8e532e0a7d5ff049b012d238afab96e74e"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.813604 5200 scope.go:117] "RemoveContainer" containerID="bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.813764 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.834995 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.835023 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4t4z\" (UniqueName: \"kubernetes.io/projected/587cd81d-cae9-4cb3-b029-c9513934be26-kube-api-access-n4t4z\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.835032 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587cd81d-cae9-4cb3-b029-c9513934be26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.835379 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"49ca5aba-a47e-4a54-97d1-bde52324fe21","Type":"ContainerStarted","Data":"f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.835422 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"49ca5aba-a47e-4a54-97d1-bde52324fe21","Type":"ContainerStarted","Data":"b1d476c4b03c41087fbe1ab4d5d1a9bfa9a8e449b2a41552625c38e0e42bf21d"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.836342 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.871213 5200 generic.go:358] "Generic (PLEG): container finished" 
podID="2c678887-cc1d-4694-9cd6-8fbacccacd07" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" exitCode=0 Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.871478 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c678887-cc1d-4694-9cd6-8fbacccacd07","Type":"ContainerDied","Data":"c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.871514 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c678887-cc1d-4694-9cd6-8fbacccacd07","Type":"ContainerDied","Data":"976f85cb87751213d380aede4ddc6583eedef61b2b11ea0b079506b1c75fe3e1"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.871620 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.882162 5200 scope.go:117] "RemoveContainer" containerID="280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.890785 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.914196 5200 generic.go:358] "Generic (PLEG): container finished" podID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerID="e2f6cf12989b01dc9fb5623c1c74e99f6c35f90c63bb8868807947c16746e8ce" exitCode=0 Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.914311 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0180cef3-1322-4a69-a824-3fa1e2fc4de9","Type":"ContainerDied","Data":"e2f6cf12989b01dc9fb5623c1c74e99f6c35f90c63bb8868807947c16746e8ce"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.923583 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.932049 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9203d723-6e51-4b7a-a443-3e9983dc3fa1","Type":"ContainerStarted","Data":"f7bff137af1b13e944d8e279056bc594c92d5cc6f13cc9b968368216b23d0e56"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.932097 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9203d723-6e51-4b7a-a443-3e9983dc3fa1","Type":"ContainerStarted","Data":"ebbc7555955ea07fff3b0f53f48c94c117e0127991e6e92cbb1299cab324e5ec"} Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.942950 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944644 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-metadata" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944669 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-metadata" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944727 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-log" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944737 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" 
containerName="nova-metadata-log" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944791 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c678887-cc1d-4694-9cd6-8fbacccacd07" containerName="nova-cell1-conductor-conductor" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944800 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c678887-cc1d-4694-9cd6-8fbacccacd07" containerName="nova-cell1-conductor-conductor" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.944995 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c678887-cc1d-4694-9cd6-8fbacccacd07" containerName="nova-cell1-conductor-conductor" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.945013 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-log" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.945020 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" containerName="nova-metadata-metadata" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.961436 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.961414776 podStartE2EDuration="1.961414776s" podCreationTimestamp="2025-11-26 15:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:25.931194388 +0000 UTC m=+5694.610396216" watchObservedRunningTime="2025-11-26 15:05:25.961414776 +0000 UTC m=+5694.640616594" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.962980 5200 scope.go:117] "RemoveContainer" containerID="bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa" Nov 26 15:05:25 crc kubenswrapper[5200]: E1126 15:05:25.976292 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa\": container with ID starting with bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa not found: ID does not exist" containerID="bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.976350 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa"} err="failed to get container status \"bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa\": rpc error: code = NotFound desc = could not find container \"bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa\": container with ID starting with bdac807350bea725e8f5d593b8e57442466df09179c02a06acb6017972473afa not found: ID does not exist" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.976376 5200 scope.go:117] "RemoveContainer" containerID="280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.977032 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.977812 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:05:25 crc kubenswrapper[5200]: I1126 15:05:25.989288 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:05:26 crc kubenswrapper[5200]: E1126 15:05:26.002379 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08\": container with ID starting with 280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08 not found: ID does not exist" containerID="280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.002472 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08"} err="failed to get container status \"280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08\": rpc error: code = NotFound desc = could not find container \"280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08\": container with ID starting with 280c99504dd4047626953bb6e20517df05ab5118c75c7c94ef1e096a0e940f08 not found: ID does not exist" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.002541 5200 scope.go:117] "RemoveContainer" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.006446 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.017860 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.024129 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.035404 5200 scope.go:117] "RemoveContainer" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.036191 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.036169159 podStartE2EDuration="3.036169159s" podCreationTimestamp="2025-11-26 15:05:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:26.036008105 +0000 UTC m=+5694.715209923" watchObservedRunningTime="2025-11-26 15:05:26.036169159 +0000 UTC m=+5694.715370977" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.039232 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-config-data\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.039281 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea5bf46-9089-43cc-9a63-552c2dbc5347-logs\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.039324 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.039420 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl8h8\" (UniqueName: \"kubernetes.io/projected/7ea5bf46-9089-43cc-9a63-552c2dbc5347-kube-api-access-sl8h8\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: E1126 15:05:26.040103 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f\": container with ID starting with c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f not found: ID does not exist" containerID="c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.040155 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f"} err="failed to get container status \"c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f\": rpc error: code = NotFound desc = could not find container \"c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f\": container with ID starting with c01c67aad141e4d68591a6bc4e3d1d490e9ff2978365976f86a2f3f4b1f0881f not found: ID does not exist" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.046048 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.046198 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.048320 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.072301 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.140970 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s275x\" (UniqueName: \"kubernetes.io/projected/a235b77d-17ac-4633-9030-940179d31fa9-kube-api-access-s275x\") pod \"a235b77d-17ac-4633-9030-940179d31fa9\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141053 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-combined-ca-bundle\") pod \"a235b77d-17ac-4633-9030-940179d31fa9\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141138 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-config-data\") pod \"a235b77d-17ac-4633-9030-940179d31fa9\" (UID: \"a235b77d-17ac-4633-9030-940179d31fa9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141274 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141358 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141390 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-config-data\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141417 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea5bf46-9089-43cc-9a63-552c2dbc5347-logs\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141443 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141510 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vszjk\" (UniqueName: \"kubernetes.io/projected/bd5d470d-750f-4f42-a88a-1518b5cb9e15-kube-api-access-vszjk\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.141533 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sl8h8\" (UniqueName: \"kubernetes.io/projected/7ea5bf46-9089-43cc-9a63-552c2dbc5347-kube-api-access-sl8h8\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.143885 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea5bf46-9089-43cc-9a63-552c2dbc5347-logs\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.159540 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.160744 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a235b77d-17ac-4633-9030-940179d31fa9-kube-api-access-s275x" (OuterVolumeSpecName: "kube-api-access-s275x") pod "a235b77d-17ac-4633-9030-940179d31fa9" (UID: "a235b77d-17ac-4633-9030-940179d31fa9"). InnerVolumeSpecName "kube-api-access-s275x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.173542 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-config-data\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.175678 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl8h8\" (UniqueName: \"kubernetes.io/projected/7ea5bf46-9089-43cc-9a63-552c2dbc5347-kube-api-access-sl8h8\") pod \"nova-metadata-0\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.179500 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a235b77d-17ac-4633-9030-940179d31fa9" (UID: "a235b77d-17ac-4633-9030-940179d31fa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.205899 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-config-data" (OuterVolumeSpecName: "config-data") pod "a235b77d-17ac-4633-9030-940179d31fa9" (UID: "a235b77d-17ac-4633-9030-940179d31fa9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.242946 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vszjk\" (UniqueName: \"kubernetes.io/projected/bd5d470d-750f-4f42-a88a-1518b5cb9e15-kube-api-access-vszjk\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.243011 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.243088 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.243195 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s275x\" (UniqueName: \"kubernetes.io/projected/a235b77d-17ac-4633-9030-940179d31fa9-kube-api-access-s275x\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.243210 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.243218 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a235b77d-17ac-4633-9030-940179d31fa9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.246722 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.248454 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.259103 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.267211 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vszjk\" (UniqueName: \"kubernetes.io/projected/bd5d470d-750f-4f42-a88a-1518b5cb9e15-kube-api-access-vszjk\") pod \"nova-cell1-conductor-0\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.348009 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p8t8\" (UniqueName: \"kubernetes.io/projected/0180cef3-1322-4a69-a824-3fa1e2fc4de9-kube-api-access-6p8t8\") pod \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.348135 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-config-data\") pod \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.348265 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0180cef3-1322-4a69-a824-3fa1e2fc4de9-logs\") pod \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.348357 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-combined-ca-bundle\") pod \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\" (UID: \"0180cef3-1322-4a69-a824-3fa1e2fc4de9\") " Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.351840 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0180cef3-1322-4a69-a824-3fa1e2fc4de9-logs" (OuterVolumeSpecName: "logs") pod "0180cef3-1322-4a69-a824-3fa1e2fc4de9" (UID: "0180cef3-1322-4a69-a824-3fa1e2fc4de9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.362349 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0180cef3-1322-4a69-a824-3fa1e2fc4de9-kube-api-access-6p8t8" (OuterVolumeSpecName: "kube-api-access-6p8t8") pod "0180cef3-1322-4a69-a824-3fa1e2fc4de9" (UID: "0180cef3-1322-4a69-a824-3fa1e2fc4de9"). InnerVolumeSpecName "kube-api-access-6p8t8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.373097 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.387913 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0180cef3-1322-4a69-a824-3fa1e2fc4de9" (UID: "0180cef3-1322-4a69-a824-3fa1e2fc4de9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.392928 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-config-data" (OuterVolumeSpecName: "config-data") pod "0180cef3-1322-4a69-a824-3fa1e2fc4de9" (UID: "0180cef3-1322-4a69-a824-3fa1e2fc4de9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.450935 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.451889 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p8t8\" (UniqueName: \"kubernetes.io/projected/0180cef3-1322-4a69-a824-3fa1e2fc4de9-kube-api-access-6p8t8\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.451901 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0180cef3-1322-4a69-a824-3fa1e2fc4de9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.451910 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0180cef3-1322-4a69-a824-3fa1e2fc4de9-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.455401 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.858356 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.946507 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0180cef3-1322-4a69-a824-3fa1e2fc4de9","Type":"ContainerDied","Data":"78cfe7ac0294fa293a975acbe1830d41086248be0401722e0dfead5d10c8f20d"} Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.946560 5200 scope.go:117] "RemoveContainer" containerID="e2f6cf12989b01dc9fb5623c1c74e99f6c35f90c63bb8868807947c16746e8ce" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.946678 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.972043 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a235b77d-17ac-4633-9030-940179d31fa9","Type":"ContainerDied","Data":"cf69c8719edf5b9189c26e9362487acc277ab83ecbe912d87efa8652d96d21de"} Nov 26 15:05:26 crc kubenswrapper[5200]: I1126 15:05:26.972221 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.000083 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c678887-cc1d-4694-9cd6-8fbacccacd07" path="/var/lib/kubelet/pods/2c678887-cc1d-4694-9cd6-8fbacccacd07/volumes" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.003016 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587cd81d-cae9-4cb3-b029-c9513934be26" path="/var/lib/kubelet/pods/587cd81d-cae9-4cb3-b029-c9513934be26/volumes" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.006270 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.006436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ea5bf46-9089-43cc-9a63-552c2dbc5347","Type":"ContainerStarted","Data":"71c9760b2eb7ff89b4d1ecf75e8288684c311e5ceeaccd3799ff49fb19b3fe24"} Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.111914 5200 scope.go:117] "RemoveContainer" containerID="c3acc206e4f3310bb235932e9d2defbec84e2d81dea1df2b6587f519882225b7" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.148846 5200 scope.go:117] "RemoveContainer" containerID="4a42c3b223f3a3802d5e2db952fd3d6969fdf5998faac4b40638aa0d5b96d08b" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.220729 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.259443 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.287767 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.288970 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-log" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.288991 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-log" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289001 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-api" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289007 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-api" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289035 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a235b77d-17ac-4633-9030-940179d31fa9" containerName="nova-scheduler-scheduler" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289042 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a235b77d-17ac-4633-9030-940179d31fa9" containerName="nova-scheduler-scheduler" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289234 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a235b77d-17ac-4633-9030-940179d31fa9" containerName="nova-scheduler-scheduler" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289253 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-api" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.289264 5200 
memory_manager.go:356] "RemoveStaleState removing state" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" containerName="nova-api-log" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.300890 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.304538 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.307138 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.381438 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-config-data\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.381785 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.382037 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk7q7\" (UniqueName: \"kubernetes.io/projected/97d1f719-18bb-48df-b324-d6037111809d-kube-api-access-xk7q7\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.490138 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk7q7\" (UniqueName: \"kubernetes.io/projected/97d1f719-18bb-48df-b324-d6037111809d-kube-api-access-xk7q7\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.490470 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-config-data\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.490537 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.497130 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-config-data\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.526585 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk7q7\" (UniqueName: \"kubernetes.io/projected/97d1f719-18bb-48df-b324-d6037111809d-kube-api-access-xk7q7\") pod \"nova-scheduler-0\" (UID: 
\"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.531750 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " pod="openstack/nova-scheduler-0" Nov 26 15:05:27 crc kubenswrapper[5200]: I1126 15:05:27.654495 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.025957 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ea5bf46-9089-43cc-9a63-552c2dbc5347","Type":"ContainerStarted","Data":"a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22"} Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.026317 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ea5bf46-9089-43cc-9a63-552c2dbc5347","Type":"ContainerStarted","Data":"d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c"} Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.031173 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd5d470d-750f-4f42-a88a-1518b5cb9e15","Type":"ContainerStarted","Data":"30d1ec1ca96a0dc77e03930adddb940ee98f9568c6d06b8e02de88c48bf70493"} Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.031374 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd5d470d-750f-4f42-a88a-1518b5cb9e15","Type":"ContainerStarted","Data":"a11cf786e379be74a247082939308f3b9c9d99ee3231c4a49a515d8cfa1b722d"} Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.031555 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.050531 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.050509743 podStartE2EDuration="3.050509743s" podCreationTimestamp="2025-11-26 15:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:28.048586552 +0000 UTC m=+5696.727788390" watchObservedRunningTime="2025-11-26 15:05:28.050509743 +0000 UTC m=+5696.729711571" Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.073437 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.073412818 podStartE2EDuration="3.073412818s" podCreationTimestamp="2025-11-26 15:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:28.069036502 +0000 UTC m=+5696.748238330" watchObservedRunningTime="2025-11-26 15:05:28.073412818 +0000 UTC m=+5696.752614636" Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.123260 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:05:28 crc kubenswrapper[5200]: I1126 15:05:28.982484 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a235b77d-17ac-4633-9030-940179d31fa9" path="/var/lib/kubelet/pods/a235b77d-17ac-4633-9030-940179d31fa9/volumes" Nov 26 15:05:29 
crc kubenswrapper[5200]: I1126 15:05:29.043131 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d1f719-18bb-48df-b324-d6037111809d","Type":"ContainerStarted","Data":"5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736"} Nov 26 15:05:29 crc kubenswrapper[5200]: I1126 15:05:29.043219 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d1f719-18bb-48df-b324-d6037111809d","Type":"ContainerStarted","Data":"6a0823885d93e40061e26dd81b786ae9ec4c5d3de8c100df361d6db9edb4a9eb"} Nov 26 15:05:29 crc kubenswrapper[5200]: I1126 15:05:29.068806 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.068779082 podStartE2EDuration="2.068779082s" podCreationTimestamp="2025-11-26 15:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:29.068399612 +0000 UTC m=+5697.747601440" watchObservedRunningTime="2025-11-26 15:05:29.068779082 +0000 UTC m=+5697.747980900" Nov 26 15:05:29 crc kubenswrapper[5200]: I1126 15:05:29.249827 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:31 crc kubenswrapper[5200]: I1126 15:05:31.373907 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:05:31 crc kubenswrapper[5200]: I1126 15:05:31.374770 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 15:05:32 crc kubenswrapper[5200]: I1126 15:05:32.654989 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 15:05:33 crc kubenswrapper[5200]: I1126 15:05:33.073078 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 26 15:05:33 crc kubenswrapper[5200]: I1126 15:05:33.154078 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:05:33 crc kubenswrapper[5200]: I1126 15:05:33.154150 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:05:34 crc kubenswrapper[5200]: I1126 15:05:34.081150 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 26 15:05:34 crc kubenswrapper[5200]: I1126 15:05:34.250231 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:34 crc kubenswrapper[5200]: I1126 15:05:34.271231 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:35 crc kubenswrapper[5200]: I1126 15:05:35.130647 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:05:36 crc kubenswrapper[5200]: I1126 15:05:36.374738 5200 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:05:36 crc kubenswrapper[5200]: I1126 15:05:36.375238 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.380973 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.381320 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.387483 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.387772 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.415889 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.415890 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.657922 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 15:05:37 crc kubenswrapper[5200]: I1126 15:05:37.752332 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 15:05:38 crc kubenswrapper[5200]: I1126 15:05:38.185176 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 15:05:38 crc kubenswrapper[5200]: E1126 15:05:38.245314 5200 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.224:45000->38.102.83.224:42551: write tcp 38.102.83.224:45000->38.102.83.224:42551: write: connection reset by peer Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.512819 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.555217 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.555389 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.558233 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scheduler-config-data\"" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.640077 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bafc74d-7236-4373-9132-c3959df67368-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.640155 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-scripts\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.640245 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.640358 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.640442 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrlft\" (UniqueName: \"kubernetes.io/projected/6bafc74d-7236-4373-9132-c3959df67368-kube-api-access-xrlft\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.640479 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.742784 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bafc74d-7236-4373-9132-c3959df67368-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.742842 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-scripts\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.742899 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.742962 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.742955 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bafc74d-7236-4373-9132-c3959df67368-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.743284 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrlft\" (UniqueName: \"kubernetes.io/projected/6bafc74d-7236-4373-9132-c3959df67368-kube-api-access-xrlft\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.743347 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.851605 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrlft\" (UniqueName: \"kubernetes.io/projected/6bafc74d-7236-4373-9132-c3959df67368-kube-api-access-xrlft\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.851612 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-scripts\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.851861 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.853194 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.857904 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:41 crc kubenswrapper[5200]: I1126 15:05:41.882935 5200 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:05:42 crc kubenswrapper[5200]: W1126 15:05:42.426201 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bafc74d_7236_4373_9132_c3959df67368.slice/crio-4774e539996712730680d66bb5026bd8c36810a3baf0e50e706f676ea9065c19 WatchSource:0}: Error finding container 4774e539996712730680d66bb5026bd8c36810a3baf0e50e706f676ea9065c19: Status 404 returned error can't find the container with id 4774e539996712730680d66bb5026bd8c36810a3baf0e50e706f676ea9065c19 Nov 26 15:05:42 crc kubenswrapper[5200]: I1126 15:05:42.427982 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:42 crc kubenswrapper[5200]: E1126 15:05:42.500300 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1760254 actualBytes=10240 Nov 26 15:05:43 crc kubenswrapper[5200]: I1126 15:05:43.132063 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:43 crc kubenswrapper[5200]: I1126 15:05:43.132927 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api-log" containerID="cri-o://a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2" gracePeriod=30 Nov 26 15:05:43 crc kubenswrapper[5200]: I1126 15:05:43.133107 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api" containerID="cri-o://1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63" gracePeriod=30 Nov 26 15:05:43 crc kubenswrapper[5200]: I1126 15:05:43.249814 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bafc74d-7236-4373-9132-c3959df67368","Type":"ContainerStarted","Data":"4774e539996712730680d66bb5026bd8c36810a3baf0e50e706f676ea9065c19"} Nov 26 15:05:43 crc kubenswrapper[5200]: I1126 15:05:43.953680 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.001186 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.010642 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-volume-volume1-config-data\"" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.013220 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.108943 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109005 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109027 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgmdx\" (UniqueName: \"kubernetes.io/projected/e186001a-92bc-4767-916f-8755715db4b8-kube-api-access-qgmdx\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109056 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109082 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e186001a-92bc-4767-916f-8755715db4b8-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109105 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109129 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109150 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: 
\"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109181 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109218 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-run\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109260 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109283 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109335 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109376 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109436 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.109468 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.223416 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 
15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224205 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224232 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgmdx\" (UniqueName: \"kubernetes.io/projected/e186001a-92bc-4767-916f-8755715db4b8-kube-api-access-qgmdx\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224268 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224306 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e186001a-92bc-4767-916f-8755715db4b8-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224328 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224355 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224381 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224414 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224437 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-run\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224480 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224504 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224548 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224592 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224640 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.224720 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.225926 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.226773 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.226845 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.226888 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-run\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.226936 5200 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.227176 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.231142 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.231888 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.231939 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.231980 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.232419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e186001a-92bc-4767-916f-8755715db4b8-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.233053 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.242488 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.243336 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e186001a-92bc-4767-916f-8755715db4b8-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0" 
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.257491 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgmdx\" (UniqueName: \"kubernetes.io/projected/e186001a-92bc-4767-916f-8755715db4b8-kube-api-access-qgmdx\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.260198 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e186001a-92bc-4767-916f-8755715db4b8-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e186001a-92bc-4767-916f-8755715db4b8\") " pod="openstack/cinder-volume-volume1-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.307059 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bafc74d-7236-4373-9132-c3959df67368","Type":"ContainerStarted","Data":"eda3612f25d99c2ad11b6498564cfbba05dd5883094ecb2765bf277fb0ccd0d0"}
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.313959 5200 generic.go:358] "Generic (PLEG): container finished" podID="835e6a17-af89-438e-b46f-55d74c21d11e" containerID="a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2" exitCode=143
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.314170 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"835e6a17-af89-438e-b46f-55d74c21d11e","Type":"ContainerDied","Data":"a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2"}
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.360373 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.638222 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.710648 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.711115 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.714805 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-backup-config-data\""
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.839807 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ab2371f-4366-4f69-9cdd-562c6581d05a-ceph\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840207 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-config-data\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840258 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840289 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-dev\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840393 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840496 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qpgk\" (UniqueName: \"kubernetes.io/projected/9ab2371f-4366-4f69-9cdd-562c6581d05a-kube-api-access-7qpgk\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840610 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-lib-modules\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840752 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.840844 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841201 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-run\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841350 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841424 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841462 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-sys\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841569 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.841626 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-scripts\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.901245 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.943748 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-config-data\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.943802 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.943824 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-dev\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.944251 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.944449 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qpgk\" (UniqueName: \"kubernetes.io/projected/9ab2371f-4366-4f69-9cdd-562c6581d05a-kube-api-access-7qpgk\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.944345 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.944320 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-dev\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.944776 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-lib-modules\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.944803 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-lib-modules\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945074 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945184 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945320 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945431 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945593 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-run\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945652 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-run\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945850 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.945947 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946094 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946249 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946379 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-sys\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946318 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946167 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0"
Nov 26
15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946447 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ab2371f-4366-4f69-9cdd-562c6581d05a-sys\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946791 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.946845 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-scripts\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.947269 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ab2371f-4366-4f69-9cdd-562c6581d05a-ceph\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.951274 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.951330 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-scripts\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.951571 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.951984 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ab2371f-4366-4f69-9cdd-562c6581d05a-ceph\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.954901 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab2371f-4366-4f69-9cdd-562c6581d05a-config-data\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:44 crc kubenswrapper[5200]: I1126 15:05:44.966354 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qpgk\" (UniqueName: \"kubernetes.io/projected/9ab2371f-4366-4f69-9cdd-562c6581d05a-kube-api-access-7qpgk\") pod \"cinder-backup-0\" (UID: \"9ab2371f-4366-4f69-9cdd-562c6581d05a\") " pod="openstack/cinder-backup-0" Nov 26 15:05:45 crc kubenswrapper[5200]: I1126 15:05:45.042956 
5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 26 15:05:45 crc kubenswrapper[5200]: I1126 15:05:45.327275 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bafc74d-7236-4373-9132-c3959df67368","Type":"ContainerStarted","Data":"c3a03cd32db99d1120b66258fe5386c0a663b75015043c3235374b1e63f89b5e"} Nov 26 15:05:45 crc kubenswrapper[5200]: I1126 15:05:45.329982 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e186001a-92bc-4767-916f-8755715db4b8","Type":"ContainerStarted","Data":"1824cbed857f54acc6192a37cdffbb67a7a001cf14021fef2ebfcc7bdf321d2b"} Nov 26 15:05:45 crc kubenswrapper[5200]: I1126 15:05:45.351552 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.351535514 podStartE2EDuration="4.351535514s" podCreationTimestamp="2025-11-26 15:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:45.346386348 +0000 UTC m=+5714.025588186" watchObservedRunningTime="2025-11-26 15:05:45.351535514 +0000 UTC m=+5714.030737332" Nov 26 15:05:45 crc kubenswrapper[5200]: I1126 15:05:45.610438 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 26 15:05:46 crc kubenswrapper[5200]: I1126 15:05:46.312738 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.79:8776/healthcheck\": read tcp 10.217.0.2:46888->10.217.1.79:8776: read: connection reset by peer" Nov 26 15:05:46 crc kubenswrapper[5200]: I1126 15:05:46.343612 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9ab2371f-4366-4f69-9cdd-562c6581d05a","Type":"ContainerStarted","Data":"b5cf5d7533838e35673831d6cf06e7d5573e9831352fe3da2b8642584ec37aa0"} Nov 26 15:05:46 crc kubenswrapper[5200]: I1126 15:05:46.379710 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:05:46 crc kubenswrapper[5200]: I1126 15:05:46.381880 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:05:46 crc kubenswrapper[5200]: I1126 15:05:46.383979 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:05:46 crc kubenswrapper[5200]: I1126 15:05:46.883455 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.312008 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.380146 5200 generic.go:358] "Generic (PLEG): container finished" podID="835e6a17-af89-438e-b46f-55d74c21d11e" containerID="1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63" exitCode=0 Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.380386 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.380566 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"835e6a17-af89-438e-b46f-55d74c21d11e","Type":"ContainerDied","Data":"1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63"} Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.380622 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"835e6a17-af89-438e-b46f-55d74c21d11e","Type":"ContainerDied","Data":"8bcae0c23d9e439a7a2bcc2347e9729c5119376b42f844ec40065d8bcb9c24f7"} Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.380642 5200 scope.go:117] "RemoveContainer" containerID="1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.384548 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.408394 5200 scope.go:117] "RemoveContainer" containerID="a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.413523 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835e6a17-af89-438e-b46f-55d74c21d11e-logs\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.413658 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-scripts\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.414490 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/835e6a17-af89-438e-b46f-55d74c21d11e-logs" (OuterVolumeSpecName: "logs") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.413680 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data-custom\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.415792 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/835e6a17-af89-438e-b46f-55d74c21d11e-etc-machine-id\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.415845 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.415965 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj8xl\" (UniqueName: \"kubernetes.io/projected/835e6a17-af89-438e-b46f-55d74c21d11e-kube-api-access-kj8xl\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.415990 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-combined-ca-bundle\") pod \"835e6a17-af89-438e-b46f-55d74c21d11e\" (UID: \"835e6a17-af89-438e-b46f-55d74c21d11e\") " Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.416390 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835e6a17-af89-438e-b46f-55d74c21d11e-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.416966 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/835e6a17-af89-438e-b46f-55d74c21d11e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.435896 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.440880 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-scripts" (OuterVolumeSpecName: "scripts") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.464001 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835e6a17-af89-438e-b46f-55d74c21d11e-kube-api-access-kj8xl" (OuterVolumeSpecName: "kube-api-access-kj8xl") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "kube-api-access-kj8xl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.507073 5200 scope.go:117] "RemoveContainer" containerID="1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.519903 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/835e6a17-af89-438e-b46f-55d74c21d11e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.519946 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kj8xl\" (UniqueName: \"kubernetes.io/projected/835e6a17-af89-438e-b46f-55d74c21d11e-kube-api-access-kj8xl\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.519961 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.519976 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.520944 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: E1126 15:05:47.521203 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63\": container with ID starting with 1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63 not found: ID does not exist" containerID="1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.521385 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63"} err="failed to get container status \"1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63\": rpc error: code = NotFound desc = could not find container \"1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63\": container with ID starting with 1419d4a7b6e8bdfb50f2e0213401a6808fbb9e43653710f398ed35ce956c0e63 not found: ID does not exist" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.521426 5200 scope.go:117] "RemoveContainer" containerID="a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2" Nov 26 15:05:47 crc kubenswrapper[5200]: E1126 15:05:47.530927 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2\": container with ID starting with a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2 not found: ID does not exist" containerID="a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.530985 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2"} err="failed to get container status \"a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2\": rpc error: code = NotFound desc = could not find container \"a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2\": container with ID starting with a8497e0c8d299b9434822a84a6aa63bf42929d9d239c01bc3a5f12223f2d0cd2 not found: ID does not exist" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.543218 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data" (OuterVolumeSpecName: "config-data") pod "835e6a17-af89-438e-b46f-55d74c21d11e" (UID: "835e6a17-af89-438e-b46f-55d74c21d11e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.623946 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.624000 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835e6a17-af89-438e-b46f-55d74c21d11e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.731315 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.740045 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.769778 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.771008 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.771031 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.771070 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api-log" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.771077 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api-log" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.771272 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.771303 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" containerName="cinder-api-log" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.843720 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.843881 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:05:47 crc kubenswrapper[5200]: I1126 15:05:47.848656 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-api-config-data\"" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032200 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-scripts\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032566 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-config-data\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032646 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1a1082-4802-475f-ba21-870d30cb5247-logs\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032717 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c1a1082-4802-475f-ba21-870d30cb5247-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032777 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-config-data-custom\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032809 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g5zf\" (UniqueName: \"kubernetes.io/projected/9c1a1082-4802-475f-ba21-870d30cb5247-kube-api-access-2g5zf\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.032854 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.135778 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-scripts\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.136207 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-config-data\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc 
kubenswrapper[5200]: I1126 15:05:48.136550 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1a1082-4802-475f-ba21-870d30cb5247-logs\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.136774 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c1a1082-4802-475f-ba21-870d30cb5247-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.136941 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-config-data-custom\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.136988 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2g5zf\" (UniqueName: \"kubernetes.io/projected/9c1a1082-4802-475f-ba21-870d30cb5247-kube-api-access-2g5zf\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.137078 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.137479 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1a1082-4802-475f-ba21-870d30cb5247-logs\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.137621 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c1a1082-4802-475f-ba21-870d30cb5247-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.143279 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-config-data-custom\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.143756 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.144356 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-scripts\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.146480 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a1082-4802-475f-ba21-870d30cb5247-config-data\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.178191 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g5zf\" (UniqueName: \"kubernetes.io/projected/9c1a1082-4802-475f-ba21-870d30cb5247-kube-api-access-2g5zf\") pod \"cinder-api-0\" (UID: \"9c1a1082-4802-475f-ba21-870d30cb5247\") " pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.299779 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.397770 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e186001a-92bc-4767-916f-8755715db4b8","Type":"ContainerStarted","Data":"7c5b962bf7882e606c18c9f692ffc991c0bf776bfbe1b2ef738cd66d808bde41"} Nov 26 15:05:48 crc kubenswrapper[5200]: I1126 15:05:48.991086 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835e6a17-af89-438e-b46f-55d74c21d11e" path="/var/lib/kubelet/pods/835e6a17-af89-438e-b46f-55d74c21d11e/volumes" Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.117486 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:05:49 crc kubenswrapper[5200]: W1126 15:05:49.125079 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c1a1082_4802_475f_ba21_870d30cb5247.slice/crio-194ca94ca7ba432753f73a8310922934f60a396990a5b01d6e05994089d8d043 WatchSource:0}: Error finding container 194ca94ca7ba432753f73a8310922934f60a396990a5b01d6e05994089d8d043: Status 404 returned error can't find the container with id 194ca94ca7ba432753f73a8310922934f60a396990a5b01d6e05994089d8d043 Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.411146 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c1a1082-4802-475f-ba21-870d30cb5247","Type":"ContainerStarted","Data":"194ca94ca7ba432753f73a8310922934f60a396990a5b01d6e05994089d8d043"} Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.417075 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9ab2371f-4366-4f69-9cdd-562c6581d05a","Type":"ContainerStarted","Data":"d9783525027aec2610897590ad77f97b4d8a4f55891696ec26fdb881893c268b"} Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.417111 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9ab2371f-4366-4f69-9cdd-562c6581d05a","Type":"ContainerStarted","Data":"7d0f9dc3e184f3a09b9b531f338d0e9547f3e9f274114f805d590df1aec1ed99"} Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.420354 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e186001a-92bc-4767-916f-8755715db4b8","Type":"ContainerStarted","Data":"4acad2ac2af99dffc67e40f9705fd894cfc569dd4199827ec140ca6088c5baf6"} Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.454324 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.480987549 podStartE2EDuration="5.454300026s" podCreationTimestamp="2025-11-26 15:05:44 +0000 UTC" 
firstStartedPulling="2025-11-26 15:05:45.628020683 +0000 UTC m=+5714.307222501" lastFinishedPulling="2025-11-26 15:05:48.60133315 +0000 UTC m=+5717.280534978" observedRunningTime="2025-11-26 15:05:49.440454091 +0000 UTC m=+5718.119655919" watchObservedRunningTime="2025-11-26 15:05:49.454300026 +0000 UTC m=+5718.133501844" Nov 26 15:05:49 crc kubenswrapper[5200]: I1126 15:05:49.480324 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.173692075 podStartE2EDuration="6.480297793s" podCreationTimestamp="2025-11-26 15:05:43 +0000 UTC" firstStartedPulling="2025-11-26 15:05:44.906996039 +0000 UTC m=+5713.586197847" lastFinishedPulling="2025-11-26 15:05:47.213601747 +0000 UTC m=+5715.892803565" observedRunningTime="2025-11-26 15:05:49.464337111 +0000 UTC m=+5718.143538959" watchObservedRunningTime="2025-11-26 15:05:49.480297793 +0000 UTC m=+5718.159499611" Nov 26 15:05:50 crc kubenswrapper[5200]: I1126 15:05:50.043915 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 26 15:05:50 crc kubenswrapper[5200]: I1126 15:05:50.447232 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c1a1082-4802-475f-ba21-870d30cb5247","Type":"ContainerStarted","Data":"96740523db068de8ed19a81347932a7046e35ce773e56226bcefffa708db596d"} Nov 26 15:05:51 crc kubenswrapper[5200]: I1126 15:05:51.463126 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9c1a1082-4802-475f-ba21-870d30cb5247","Type":"ContainerStarted","Data":"98d15fd393bf61e77f8c122d65882896f4d39eb8da8052b28f17e8d24cd4d310"} Nov 26 15:05:51 crc kubenswrapper[5200]: I1126 15:05:51.463725 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/cinder-api-0" Nov 26 15:05:51 crc kubenswrapper[5200]: I1126 15:05:51.513145 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.513119823 podStartE2EDuration="4.513119823s" podCreationTimestamp="2025-11-26 15:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:51.48495258 +0000 UTC m=+5720.164154478" watchObservedRunningTime="2025-11-26 15:05:51.513119823 +0000 UTC m=+5720.192321641" Nov 26 15:05:52 crc kubenswrapper[5200]: I1126 15:05:52.133477 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 15:05:52 crc kubenswrapper[5200]: I1126 15:05:52.177744 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:52 crc kubenswrapper[5200]: I1126 15:05:52.470151 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="probe" containerID="cri-o://c3a03cd32db99d1120b66258fe5386c0a663b75015043c3235374b1e63f89b5e" gracePeriod=30 Nov 26 15:05:52 crc kubenswrapper[5200]: I1126 15:05:52.470079 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="cinder-scheduler" containerID="cri-o://eda3612f25d99c2ad11b6498564cfbba05dd5883094ecb2765bf277fb0ccd0d0" gracePeriod=30 Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.494154 5200 generic.go:358] "Generic 
(PLEG): container finished" podID="6bafc74d-7236-4373-9132-c3959df67368" containerID="c3a03cd32db99d1120b66258fe5386c0a663b75015043c3235374b1e63f89b5e" exitCode=0 Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.494791 5200 generic.go:358] "Generic (PLEG): container finished" podID="6bafc74d-7236-4373-9132-c3959df67368" containerID="eda3612f25d99c2ad11b6498564cfbba05dd5883094ecb2765bf277fb0ccd0d0" exitCode=0 Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.496178 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bafc74d-7236-4373-9132-c3959df67368","Type":"ContainerDied","Data":"c3a03cd32db99d1120b66258fe5386c0a663b75015043c3235374b1e63f89b5e"} Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.496258 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bafc74d-7236-4373-9132-c3959df67368","Type":"ContainerDied","Data":"eda3612f25d99c2ad11b6498564cfbba05dd5883094ecb2765bf277fb0ccd0d0"} Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.724593 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.792154 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data-custom\") pod \"6bafc74d-7236-4373-9132-c3959df67368\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.792436 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-combined-ca-bundle\") pod \"6bafc74d-7236-4373-9132-c3959df67368\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.792507 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-scripts\") pod \"6bafc74d-7236-4373-9132-c3959df67368\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.792549 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrlft\" (UniqueName: \"kubernetes.io/projected/6bafc74d-7236-4373-9132-c3959df67368-kube-api-access-xrlft\") pod \"6bafc74d-7236-4373-9132-c3959df67368\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.793952 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bafc74d-7236-4373-9132-c3959df67368-etc-machine-id\") pod \"6bafc74d-7236-4373-9132-c3959df67368\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.794023 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bafc74d-7236-4373-9132-c3959df67368-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6bafc74d-7236-4373-9132-c3959df67368" (UID: "6bafc74d-7236-4373-9132-c3959df67368"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.794055 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data\") pod \"6bafc74d-7236-4373-9132-c3959df67368\" (UID: \"6bafc74d-7236-4373-9132-c3959df67368\") " Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.795675 5200 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bafc74d-7236-4373-9132-c3959df67368-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.800616 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bafc74d-7236-4373-9132-c3959df67368-kube-api-access-xrlft" (OuterVolumeSpecName: "kube-api-access-xrlft") pod "6bafc74d-7236-4373-9132-c3959df67368" (UID: "6bafc74d-7236-4373-9132-c3959df67368"). InnerVolumeSpecName "kube-api-access-xrlft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.802379 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-scripts" (OuterVolumeSpecName: "scripts") pod "6bafc74d-7236-4373-9132-c3959df67368" (UID: "6bafc74d-7236-4373-9132-c3959df67368"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.802538 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6bafc74d-7236-4373-9132-c3959df67368" (UID: "6bafc74d-7236-4373-9132-c3959df67368"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.873557 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bafc74d-7236-4373-9132-c3959df67368" (UID: "6bafc74d-7236-4373-9132-c3959df67368"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.897612 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.897652 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.897665 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrlft\" (UniqueName: \"kubernetes.io/projected/6bafc74d-7236-4373-9132-c3959df67368-kube-api-access-xrlft\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.897703 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.928663 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data" (OuterVolumeSpecName: "config-data") pod "6bafc74d-7236-4373-9132-c3959df67368" (UID: "6bafc74d-7236-4373-9132-c3959df67368"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:05:53 crc kubenswrapper[5200]: I1126 15:05:53.999780 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bafc74d-7236-4373-9132-c3959df67368-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.361191 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.524166 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bafc74d-7236-4373-9132-c3959df67368","Type":"ContainerDied","Data":"4774e539996712730680d66bb5026bd8c36810a3baf0e50e706f676ea9065c19"} Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.524200 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.524266 5200 scope.go:117] "RemoveContainer" containerID="c3a03cd32db99d1120b66258fe5386c0a663b75015043c3235374b1e63f89b5e" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.574723 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.595014 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.597836 5200 scope.go:117] "RemoveContainer" containerID="eda3612f25d99c2ad11b6498564cfbba05dd5883094ecb2765bf277fb0ccd0d0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.604113 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.605748 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="cinder-scheduler" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.605773 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="cinder-scheduler" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.605804 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="probe" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.605809 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="probe" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.606011 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="probe" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.606030 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bafc74d-7236-4373-9132-c3959df67368" containerName="cinder-scheduler" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.621880 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.622803 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.625218 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scheduler-config-data\"" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.629227 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.717700 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.717780 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.717844 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlxzs\" (UniqueName: \"kubernetes.io/projected/72e329ae-c511-4991-9788-3c303b795651-kube-api-access-mlxzs\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.717892 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-scripts\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.717913 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72e329ae-c511-4991-9788-3c303b795651-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.717980 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-config-data\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.820062 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-config-data\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.820181 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-config-data-custom\") pod \"cinder-scheduler-0\" 
(UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.820238 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.820297 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlxzs\" (UniqueName: \"kubernetes.io/projected/72e329ae-c511-4991-9788-3c303b795651-kube-api-access-mlxzs\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.820862 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-scripts\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.820904 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72e329ae-c511-4991-9788-3c303b795651-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.821022 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72e329ae-c511-4991-9788-3c303b795651-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.826325 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.828820 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.829998 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-config-data\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.840354 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlxzs\" (UniqueName: \"kubernetes.io/projected/72e329ae-c511-4991-9788-3c303b795651-kube-api-access-mlxzs\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.842725 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/72e329ae-c511-4991-9788-3c303b795651-scripts\") pod \"cinder-scheduler-0\" (UID: \"72e329ae-c511-4991-9788-3c303b795651\") " pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.961796 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:05:54 crc kubenswrapper[5200]: I1126 15:05:54.988778 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bafc74d-7236-4373-9132-c3959df67368" path="/var/lib/kubelet/pods/6bafc74d-7236-4373-9132-c3959df67368/volumes" Nov 26 15:05:55 crc kubenswrapper[5200]: I1126 15:05:55.294995 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 26 15:05:55 crc kubenswrapper[5200]: I1126 15:05:55.449257 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:05:55 crc kubenswrapper[5200]: I1126 15:05:55.540567 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72e329ae-c511-4991-9788-3c303b795651","Type":"ContainerStarted","Data":"a4ac4a08664070d7e9606b81132aa2ac6fcf55c75caa6a8b392ea8a553cbbd9c"} Nov 26 15:05:56 crc kubenswrapper[5200]: I1126 15:05:56.557729 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72e329ae-c511-4991-9788-3c303b795651","Type":"ContainerStarted","Data":"0a5628a5c8faa915241290555b1967786f1e3bf5f6397956d8e34996e566221d"} Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.096362 5200 pod_container_manager_linux.go:217] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod0180cef3-1322-4a69-a824-3fa1e2fc4de9"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod0180cef3-1322-4a69-a824-3fa1e2fc4de9] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0180cef3_1322_4a69_a824_3fa1e2fc4de9.slice" Nov 26 15:05:57 crc kubenswrapper[5200]: E1126 15:05:57.096730 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod0180cef3-1322-4a69-a824-3fa1e2fc4de9] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod0180cef3-1322-4a69-a824-3fa1e2fc4de9] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0180cef3_1322_4a69_a824_3fa1e2fc4de9.slice" pod="openstack/nova-api-0" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.568114 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72e329ae-c511-4991-9788-3c303b795651","Type":"ContainerStarted","Data":"5571a582be1b11a500ff8470eae0ff3e660101afb670f8be5f3050c19465f13f"} Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.568140 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.626387 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.626335705 podStartE2EDuration="3.626335705s" podCreationTimestamp="2025-11-26 15:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:57.609988774 +0000 UTC m=+5726.289190622" watchObservedRunningTime="2025-11-26 15:05:57.626335705 +0000 UTC m=+5726.305537543" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.639866 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.656836 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.685871 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.696376 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.698082 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.731530 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.799186 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-config-data\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.799623 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.800007 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-logs\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.800078 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff7zz\" (UniqueName: \"kubernetes.io/projected/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-kube-api-access-ff7zz\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.902016 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-logs\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.902084 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff7zz\" (UniqueName: 
\"kubernetes.io/projected/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-kube-api-access-ff7zz\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.902172 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-config-data\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.902242 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.902628 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-logs\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.918406 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.918497 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-config-data\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:57 crc kubenswrapper[5200]: I1126 15:05:57.937858 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff7zz\" (UniqueName: \"kubernetes.io/projected/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-kube-api-access-ff7zz\") pod \"nova-api-0\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " pod="openstack/nova-api-0" Nov 26 15:05:58 crc kubenswrapper[5200]: I1126 15:05:58.041462 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:05:58 crc kubenswrapper[5200]: I1126 15:05:58.450035 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:05:58 crc kubenswrapper[5200]: I1126 15:05:58.580433 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9","Type":"ContainerStarted","Data":"8eededf9eecc6b29afb08347c9ec616704844d74433e4d366a49369299092248"} Nov 26 15:05:59 crc kubenswrapper[5200]: I1126 15:05:59.014610 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0180cef3-1322-4a69-a824-3fa1e2fc4de9" path="/var/lib/kubelet/pods/0180cef3-1322-4a69-a824-3fa1e2fc4de9/volumes" Nov 26 15:05:59 crc kubenswrapper[5200]: I1126 15:05:59.594420 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9","Type":"ContainerStarted","Data":"c373a82adbc473c635db51e1820ba54fe6a8f7fc766776e3c21a96eae29830b0"} Nov 26 15:05:59 crc kubenswrapper[5200]: I1126 15:05:59.595548 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9","Type":"ContainerStarted","Data":"77635436a0c57c13afcc0831d7957c372f2fb1e1998f6eabe000a747c9c668dc"} Nov 26 15:05:59 crc kubenswrapper[5200]: I1126 15:05:59.617912 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.617887117 podStartE2EDuration="2.617887117s" podCreationTimestamp="2025-11-26 15:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:59.615621998 +0000 UTC m=+5728.294823856" watchObservedRunningTime="2025-11-26 15:05:59.617887117 +0000 UTC m=+5728.297088945" Nov 26 15:05:59 crc kubenswrapper[5200]: I1126 15:05:59.962144 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 15:06:00 crc kubenswrapper[5200]: I1126 15:06:00.296927 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 26 15:06:03 crc kubenswrapper[5200]: I1126 15:06:03.154256 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:06:03 crc kubenswrapper[5200]: I1126 15:06:03.154593 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:06:05 crc kubenswrapper[5200]: I1126 15:06:05.196700 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 15:06:08 crc kubenswrapper[5200]: I1126 15:06:08.047017 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:06:08 crc kubenswrapper[5200]: I1126 15:06:08.049578 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:06:09 crc kubenswrapper[5200]: I1126 15:06:09.088951 5200 
prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:06:09 crc kubenswrapper[5200]: I1126 15:06:09.090979 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:06:18 crc kubenswrapper[5200]: I1126 15:06:18.045883 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:06:18 crc kubenswrapper[5200]: I1126 15:06:18.046993 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:06:18 crc kubenswrapper[5200]: I1126 15:06:18.047032 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 15:06:18 crc kubenswrapper[5200]: I1126 15:06:18.050673 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:06:18 crc kubenswrapper[5200]: I1126 15:06:18.789337 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 15:06:18 crc kubenswrapper[5200]: I1126 15:06:18.793479 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:06:24 crc kubenswrapper[5200]: I1126 15:06:24.836627 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7blf4"] Nov 26 15:06:24 crc kubenswrapper[5200]: I1126 15:06:24.846517 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:24 crc kubenswrapper[5200]: I1126 15:06:24.851270 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7blf4"] Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.005074 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-catalog-content\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.005151 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghvss\" (UniqueName: \"kubernetes.io/projected/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-kube-api-access-ghvss\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.005265 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-utilities\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.107467 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-catalog-content\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.107525 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghvss\" (UniqueName: \"kubernetes.io/projected/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-kube-api-access-ghvss\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.107600 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-utilities\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.108163 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-catalog-content\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.108176 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-utilities\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.128787 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ghvss\" (UniqueName: \"kubernetes.io/projected/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-kube-api-access-ghvss\") pod \"certified-operators-7blf4\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.177911 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.636467 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7blf4"] Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.858498 5200 generic.go:358] "Generic (PLEG): container finished" podID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerID="1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1" exitCode=0 Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.858604 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerDied","Data":"1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1"} Nov 26 15:06:25 crc kubenswrapper[5200]: I1126 15:06:25.858749 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerStarted","Data":"2975f2e98127c656463c0a40c395b3d1fa68c6f5985308f298da32b06b59efbe"} Nov 26 15:06:26 crc kubenswrapper[5200]: I1126 15:06:26.872857 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerStarted","Data":"1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6"} Nov 26 15:06:27 crc kubenswrapper[5200]: I1126 15:06:27.886194 5200 generic.go:358] "Generic (PLEG): container finished" podID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerID="1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6" exitCode=0 Nov 26 15:06:27 crc kubenswrapper[5200]: I1126 15:06:27.886279 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerDied","Data":"1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6"} Nov 26 15:06:28 crc kubenswrapper[5200]: I1126 15:06:28.899022 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerStarted","Data":"cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72"} Nov 26 15:06:28 crc kubenswrapper[5200]: I1126 15:06:28.941858 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7blf4" podStartSLOduration=4.165942094 podStartE2EDuration="4.941826973s" podCreationTimestamp="2025-11-26 15:06:24 +0000 UTC" firstStartedPulling="2025-11-26 15:06:25.859684994 +0000 UTC m=+5754.538886812" lastFinishedPulling="2025-11-26 15:06:26.635569873 +0000 UTC m=+5755.314771691" observedRunningTime="2025-11-26 15:06:28.919542361 +0000 UTC m=+5757.598744199" watchObservedRunningTime="2025-11-26 15:06:28.941826973 +0000 UTC m=+5757.621028831" Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.154814 5200 patch_prober.go:28] interesting 
pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.155230 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.155298 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.156238 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.156339 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" gracePeriod=600 Nov 26 15:06:33 crc kubenswrapper[5200]: E1126 15:06:33.297243 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.948159 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" exitCode=0 Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.948237 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd"} Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.948488 5200 scope.go:117] "RemoveContainer" containerID="c57089b0cc90614ea2b7d202afb3706d3131737053c127ff99d3fcebdf0010ac" Nov 26 15:06:33 crc kubenswrapper[5200]: I1126 15:06:33.949132 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:06:33 crc kubenswrapper[5200]: E1126 15:06:33.949601 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:06:35 crc kubenswrapper[5200]: I1126 15:06:35.178546 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:35 crc kubenswrapper[5200]: I1126 15:06:35.179186 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:35 crc kubenswrapper[5200]: I1126 15:06:35.234720 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:36 crc kubenswrapper[5200]: I1126 15:06:36.021808 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:36 crc kubenswrapper[5200]: I1126 15:06:36.077748 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7blf4"] Nov 26 15:06:37 crc kubenswrapper[5200]: I1126 15:06:37.996530 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7blf4" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="registry-server" containerID="cri-o://cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72" gracePeriod=2 Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.591536 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.718484 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-catalog-content\") pod \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.718650 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-utilities\") pod \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.718755 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghvss\" (UniqueName: \"kubernetes.io/projected/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-kube-api-access-ghvss\") pod \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\" (UID: \"0d3ab632-0d08-4069-8ed7-afe55e04a6c2\") " Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.720781 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-utilities" (OuterVolumeSpecName: "utilities") pod "0d3ab632-0d08-4069-8ed7-afe55e04a6c2" (UID: "0d3ab632-0d08-4069-8ed7-afe55e04a6c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.724833 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-kube-api-access-ghvss" (OuterVolumeSpecName: "kube-api-access-ghvss") pod "0d3ab632-0d08-4069-8ed7-afe55e04a6c2" (UID: "0d3ab632-0d08-4069-8ed7-afe55e04a6c2"). InnerVolumeSpecName "kube-api-access-ghvss". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.747064 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d3ab632-0d08-4069-8ed7-afe55e04a6c2" (UID: "0d3ab632-0d08-4069-8ed7-afe55e04a6c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.830307 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.830353 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:38 crc kubenswrapper[5200]: I1126 15:06:38.830364 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghvss\" (UniqueName: \"kubernetes.io/projected/0d3ab632-0d08-4069-8ed7-afe55e04a6c2-kube-api-access-ghvss\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.010350 5200 generic.go:358] "Generic (PLEG): container finished" podID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerID="cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72" exitCode=0 Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.010417 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerDied","Data":"cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72"} Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.010459 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7blf4" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.010855 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7blf4" event={"ID":"0d3ab632-0d08-4069-8ed7-afe55e04a6c2","Type":"ContainerDied","Data":"2975f2e98127c656463c0a40c395b3d1fa68c6f5985308f298da32b06b59efbe"} Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.010886 5200 scope.go:117] "RemoveContainer" containerID="cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.041913 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7blf4"] Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.052841 5200 scope.go:117] "RemoveContainer" containerID="1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.053148 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7blf4"] Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.077675 5200 scope.go:117] "RemoveContainer" containerID="1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.122628 5200 scope.go:117] "RemoveContainer" containerID="cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72" Nov 26 15:06:39 crc kubenswrapper[5200]: E1126 15:06:39.123292 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72\": container with ID starting with cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72 not found: ID does not exist" containerID="cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.123347 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72"} err="failed to get container status \"cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72\": rpc error: code = NotFound desc = could not find container \"cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72\": container with ID starting with cc650d021e28cc908a64b5789ebc581e0fc558724933c856c163bf77f1991d72 not found: ID does not exist" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.123376 5200 scope.go:117] "RemoveContainer" containerID="1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6" Nov 26 15:06:39 crc kubenswrapper[5200]: E1126 15:06:39.123893 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6\": container with ID starting with 1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6 not found: ID does not exist" containerID="1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.123945 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6"} err="failed to get container status \"1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6\": rpc error: code = NotFound desc = could not find 
container \"1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6\": container with ID starting with 1baae606de27f7c196c8bca0c543d15d1caaac841f302ceaa78bcf9ce250c2d6 not found: ID does not exist" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.123976 5200 scope.go:117] "RemoveContainer" containerID="1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1" Nov 26 15:06:39 crc kubenswrapper[5200]: E1126 15:06:39.124441 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1\": container with ID starting with 1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1 not found: ID does not exist" containerID="1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1" Nov 26 15:06:39 crc kubenswrapper[5200]: I1126 15:06:39.124475 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1"} err="failed to get container status \"1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1\": rpc error: code = NotFound desc = could not find container \"1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1\": container with ID starting with 1550d14450133e089b74341275da3ccdcc2ada2732a1443b416bfcb2feb017d1 not found: ID does not exist" Nov 26 15:06:40 crc kubenswrapper[5200]: I1126 15:06:40.984388 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" path="/var/lib/kubelet/pods/0d3ab632-0d08-4069-8ed7-afe55e04a6c2/volumes" Nov 26 15:06:42 crc kubenswrapper[5200]: E1126 15:06:42.386254 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1791577 actualBytes=10240 Nov 26 15:06:48 crc kubenswrapper[5200]: I1126 15:06:48.970789 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:06:48 crc kubenswrapper[5200]: E1126 15:06:48.972110 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:07:03 crc kubenswrapper[5200]: I1126 15:07:03.970008 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:07:03 crc kubenswrapper[5200]: E1126 15:07:03.973249 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:07:18 crc kubenswrapper[5200]: I1126 15:07:18.969859 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:07:18 crc kubenswrapper[5200]: E1126 15:07:18.971369 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:07:31 crc kubenswrapper[5200]: I1126 15:07:31.075237 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a6e0-account-create-update-976w8"] Nov 26 15:07:31 crc kubenswrapper[5200]: I1126 15:07:31.086001 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cwzk4"] Nov 26 15:07:31 crc kubenswrapper[5200]: I1126 15:07:31.096970 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a6e0-account-create-update-976w8"] Nov 26 15:07:31 crc kubenswrapper[5200]: I1126 15:07:31.105756 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cwzk4"] Nov 26 15:07:32 crc kubenswrapper[5200]: I1126 15:07:32.982651 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d6650b-66a6-4341-b458-ada349675a6a" path="/var/lib/kubelet/pods/29d6650b-66a6-4341-b458-ada349675a6a/volumes" Nov 26 15:07:32 crc kubenswrapper[5200]: I1126 15:07:32.982885 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:07:32 crc kubenswrapper[5200]: E1126 15:07:32.983559 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:07:32 crc kubenswrapper[5200]: I1126 15:07:32.984179 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9bcfe0-f199-4c6c-8317-6beab93a44a4" path="/var/lib/kubelet/pods/9d9bcfe0-f199-4c6c-8317-6beab93a44a4/volumes" Nov 26 15:07:39 crc kubenswrapper[5200]: I1126 15:07:39.037385 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qctzs"] Nov 26 15:07:39 crc kubenswrapper[5200]: I1126 15:07:39.047084 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qctzs"] Nov 26 15:07:40 crc kubenswrapper[5200]: I1126 15:07:40.986441 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df158a9-9d0e-486f-a9e0-d9fe18c5993b" path="/var/lib/kubelet/pods/3df158a9-9d0e-486f-a9e0-d9fe18c5993b/volumes" Nov 26 15:07:42 crc kubenswrapper[5200]: E1126 15:07:42.420398 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1791581 actualBytes=10240 Nov 26 15:07:43 crc kubenswrapper[5200]: I1126 15:07:43.969606 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:07:43 crc kubenswrapper[5200]: E1126 15:07:43.970302 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.627194 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vqkcw"] Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628425 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="extract-content" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628446 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="extract-content" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628461 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="extract-utilities" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628467 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="extract-utilities" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628507 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="registry-server" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628514 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="registry-server" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.628723 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d3ab632-0d08-4069-8ed7-afe55e04a6c2" containerName="registry-server" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.634061 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.636735 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncontroller-ovncontroller-dockercfg-88kpm\"" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.636856 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-scripts\"" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.642221 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-t88m7"] Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.672797 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.705071 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-run-ovn\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.705137 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-log-ovn\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712238 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-run\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712329 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-scripts\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712387 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljcsj\" (UniqueName: \"kubernetes.io/projected/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-kube-api-access-ljcsj\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712468 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-etc-ovs\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712623 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-run\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712662 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzbzj\" (UniqueName: \"kubernetes.io/projected/9f1d74d7-e2c1-4e67-8d72-785570da51f1-kube-api-access-jzbzj\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712728 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f1d74d7-e2c1-4e67-8d72-785570da51f1-scripts\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " 
pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.712899 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-log\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.713016 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-lib\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.713246 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vqkcw"] Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.733727 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t88m7"] Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814325 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-run\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814385 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzbzj\" (UniqueName: \"kubernetes.io/projected/9f1d74d7-e2c1-4e67-8d72-785570da51f1-kube-api-access-jzbzj\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814414 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f1d74d7-e2c1-4e67-8d72-785570da51f1-scripts\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814467 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-log\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814527 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-lib\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814575 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-run-ovn\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814917 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-log-ovn\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814778 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-log\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814975 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-run\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814807 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-lib\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.814958 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-run-ovn\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.815019 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-log-ovn\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.815063 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-var-run\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.815108 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-scripts\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.815150 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljcsj\" (UniqueName: \"kubernetes.io/projected/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-kube-api-access-ljcsj\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.815195 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-etc-ovs\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.815346 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-etc-ovs\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.816487 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f1d74d7-e2c1-4e67-8d72-785570da51f1-scripts\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.816666 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f1d74d7-e2c1-4e67-8d72-785570da51f1-var-run\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.817247 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-scripts\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.833270 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzbzj\" (UniqueName: \"kubernetes.io/projected/9f1d74d7-e2c1-4e67-8d72-785570da51f1-kube-api-access-jzbzj\") pod \"ovn-controller-ovs-t88m7\" (UID: \"9f1d74d7-e2c1-4e67-8d72-785570da51f1\") " pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.833514 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljcsj\" (UniqueName: \"kubernetes.io/projected/b2e3f416-2e99-40ef-b104-c2ef740b5a2b-kube-api-access-ljcsj\") pod \"ovn-controller-vqkcw\" (UID: \"b2e3f416-2e99-40ef-b104-c2ef740b5a2b\") " pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:46 crc kubenswrapper[5200]: I1126 15:07:46.979920 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:47 crc kubenswrapper[5200]: I1126 15:07:47.004935 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:47 crc kubenswrapper[5200]: I1126 15:07:47.485910 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vqkcw"] Nov 26 15:07:47 crc kubenswrapper[5200]: I1126 15:07:47.748794 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vqkcw" event={"ID":"b2e3f416-2e99-40ef-b104-c2ef740b5a2b","Type":"ContainerStarted","Data":"447181fa02b5690efcb5b7665aec61b22dca0b400814b6e7cfdfa90ae8e0ceac"} Nov 26 15:07:47 crc kubenswrapper[5200]: I1126 15:07:47.890054 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t88m7"] Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.197546 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wvc4w"] Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.209239 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.218155 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wvc4w"] Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.218674 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-metrics-config\"" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.246822 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/472b8ab3-bd19-4977-bf47-b2336521f28f-ovs-rundir\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.247195 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/472b8ab3-bd19-4977-bf47-b2336521f28f-ovn-rundir\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.247357 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/472b8ab3-bd19-4977-bf47-b2336521f28f-config\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.247483 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnvsl\" (UniqueName: \"kubernetes.io/projected/472b8ab3-bd19-4977-bf47-b2336521f28f-kube-api-access-rnvsl\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.349870 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/472b8ab3-bd19-4977-bf47-b2336521f28f-ovs-rundir\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.349970 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/472b8ab3-bd19-4977-bf47-b2336521f28f-ovn-rundir\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.350028 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/472b8ab3-bd19-4977-bf47-b2336521f28f-config\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.350055 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnvsl\" (UniqueName: \"kubernetes.io/projected/472b8ab3-bd19-4977-bf47-b2336521f28f-kube-api-access-rnvsl\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " 
pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.350711 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/472b8ab3-bd19-4977-bf47-b2336521f28f-ovs-rundir\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.350765 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/472b8ab3-bd19-4977-bf47-b2336521f28f-ovn-rundir\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.351392 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/472b8ab3-bd19-4977-bf47-b2336521f28f-config\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.378634 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnvsl\" (UniqueName: \"kubernetes.io/projected/472b8ab3-bd19-4977-bf47-b2336521f28f-kube-api-access-rnvsl\") pod \"ovn-controller-metrics-wvc4w\" (UID: \"472b8ab3-bd19-4977-bf47-b2336521f28f\") " pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.531234 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wvc4w" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.765975 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t88m7" event={"ID":"9f1d74d7-e2c1-4e67-8d72-785570da51f1","Type":"ContainerStarted","Data":"82e0ca48ea395018ee242d9c9c009545619440453ec898a254b9f30067bc7781"} Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.767128 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t88m7" event={"ID":"9f1d74d7-e2c1-4e67-8d72-785570da51f1","Type":"ContainerStarted","Data":"0fa49412d02d9687c295c7b53e2a1c47ad5de1263213093c5af3ed0cb7e810c3"} Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.768741 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vqkcw" event={"ID":"b2e3f416-2e99-40ef-b104-c2ef740b5a2b","Type":"ContainerStarted","Data":"4666b5b59203a4d2780c6cac4a5d4cf28051d90204fc641798b08a3f33c71e66"} Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.768987 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-vqkcw" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.836357 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vqkcw" podStartSLOduration=2.836335361 podStartE2EDuration="2.836335361s" podCreationTimestamp="2025-11-26 15:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:48.821039065 +0000 UTC m=+5837.500240883" watchObservedRunningTime="2025-11-26 15:07:48.836335361 +0000 UTC m=+5837.515537179" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.877880 5200 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openstack/octavia-db-create-zk9wv"] Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.887894 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-zk9wv"] Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.888008 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.973812 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkd9s\" (UniqueName: \"kubernetes.io/projected/99476d44-82e6-494b-aa46-55e545bace8a-kube-api-access-zkd9s\") pod \"octavia-db-create-zk9wv\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:48 crc kubenswrapper[5200]: I1126 15:07:48.974157 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99476d44-82e6-494b-aa46-55e545bace8a-operator-scripts\") pod \"octavia-db-create-zk9wv\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.076611 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkd9s\" (UniqueName: \"kubernetes.io/projected/99476d44-82e6-494b-aa46-55e545bace8a-kube-api-access-zkd9s\") pod \"octavia-db-create-zk9wv\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.076810 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99476d44-82e6-494b-aa46-55e545bace8a-operator-scripts\") pod \"octavia-db-create-zk9wv\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.077644 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99476d44-82e6-494b-aa46-55e545bace8a-operator-scripts\") pod \"octavia-db-create-zk9wv\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.096623 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wvc4w"] Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.102602 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkd9s\" (UniqueName: \"kubernetes.io/projected/99476d44-82e6-494b-aa46-55e545bace8a-kube-api-access-zkd9s\") pod \"octavia-db-create-zk9wv\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:49 crc kubenswrapper[5200]: W1126 15:07:49.103260 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472b8ab3_bd19_4977_bf47_b2336521f28f.slice/crio-542d27f06d8364a0953e2f8171b14689c79a4237f66c834fc7c1c51790724ba0 WatchSource:0}: Error finding container 542d27f06d8364a0953e2f8171b14689c79a4237f66c834fc7c1c51790724ba0: Status 404 returned error can't find the container with id 542d27f06d8364a0953e2f8171b14689c79a4237f66c834fc7c1c51790724ba0 Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.226732 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.723891 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-zk9wv"] Nov 26 15:07:49 crc kubenswrapper[5200]: W1126 15:07:49.730025 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99476d44_82e6_494b_aa46_55e545bace8a.slice/crio-071cc9f8bc8e77c735a1817acd6f7efc12b625f681ba60b94b6fb0c76fc6042a WatchSource:0}: Error finding container 071cc9f8bc8e77c735a1817acd6f7efc12b625f681ba60b94b6fb0c76fc6042a: Status 404 returned error can't find the container with id 071cc9f8bc8e77c735a1817acd6f7efc12b625f681ba60b94b6fb0c76fc6042a Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.755766 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-436f-account-create-update-2vstr"] Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.768697 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-436f-account-create-update-2vstr"] Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.768821 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.771206 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-db-secret\"" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.803997 5200 generic.go:358] "Generic (PLEG): container finished" podID="9f1d74d7-e2c1-4e67-8d72-785570da51f1" containerID="82e0ca48ea395018ee242d9c9c009545619440453ec898a254b9f30067bc7781" exitCode=0 Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.804296 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t88m7" event={"ID":"9f1d74d7-e2c1-4e67-8d72-785570da51f1","Type":"ContainerDied","Data":"82e0ca48ea395018ee242d9c9c009545619440453ec898a254b9f30067bc7781"} Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.811061 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-zk9wv" event={"ID":"99476d44-82e6-494b-aa46-55e545bace8a","Type":"ContainerStarted","Data":"071cc9f8bc8e77c735a1817acd6f7efc12b625f681ba60b94b6fb0c76fc6042a"} Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.818035 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wvc4w" event={"ID":"472b8ab3-bd19-4977-bf47-b2336521f28f","Type":"ContainerStarted","Data":"30c72a6ae48037797bf495263f2d35cf6ecbe1fd78a1b66ba8e1df6c1dbb3f50"} Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.818173 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wvc4w" event={"ID":"472b8ab3-bd19-4977-bf47-b2336521f28f","Type":"ContainerStarted","Data":"542d27f06d8364a0953e2f8171b14689c79a4237f66c834fc7c1c51790724ba0"} Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.875336 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wvc4w" podStartSLOduration=1.875311273 podStartE2EDuration="1.875311273s" podCreationTimestamp="2025-11-26 15:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:49.842994544 +0000 UTC m=+5838.522196362" watchObservedRunningTime="2025-11-26 15:07:49.875311273 +0000 UTC 
m=+5838.554513091" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.895602 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515c1e67-f058-4858-a438-6efc983b5a03-operator-scripts\") pod \"octavia-436f-account-create-update-2vstr\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:49 crc kubenswrapper[5200]: I1126 15:07:49.895709 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gllz\" (UniqueName: \"kubernetes.io/projected/515c1e67-f058-4858-a438-6efc983b5a03-kube-api-access-4gllz\") pod \"octavia-436f-account-create-update-2vstr\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.000423 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515c1e67-f058-4858-a438-6efc983b5a03-operator-scripts\") pod \"octavia-436f-account-create-update-2vstr\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.000498 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gllz\" (UniqueName: \"kubernetes.io/projected/515c1e67-f058-4858-a438-6efc983b5a03-kube-api-access-4gllz\") pod \"octavia-436f-account-create-update-2vstr\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.001616 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515c1e67-f058-4858-a438-6efc983b5a03-operator-scripts\") pod \"octavia-436f-account-create-update-2vstr\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.021157 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gllz\" (UniqueName: \"kubernetes.io/projected/515c1e67-f058-4858-a438-6efc983b5a03-kube-api-access-4gllz\") pod \"octavia-436f-account-create-update-2vstr\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.117769 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.597633 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-436f-account-create-update-2vstr"] Nov 26 15:07:50 crc kubenswrapper[5200]: W1126 15:07:50.608193 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515c1e67_f058_4858_a438_6efc983b5a03.slice/crio-f3f0bc35762f606a3a69c0d0a2611b6e8eb0b296574bfab9be6bcda0d2a03c2c WatchSource:0}: Error finding container f3f0bc35762f606a3a69c0d0a2611b6e8eb0b296574bfab9be6bcda0d2a03c2c: Status 404 returned error can't find the container with id f3f0bc35762f606a3a69c0d0a2611b6e8eb0b296574bfab9be6bcda0d2a03c2c Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.829764 5200 generic.go:358] "Generic (PLEG): container finished" podID="99476d44-82e6-494b-aa46-55e545bace8a" containerID="01dbc8547202186d0bcaff29c4b0992d32225f8543c54be4cdc0efba2da80552" exitCode=0 Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.830269 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-zk9wv" event={"ID":"99476d44-82e6-494b-aa46-55e545bace8a","Type":"ContainerDied","Data":"01dbc8547202186d0bcaff29c4b0992d32225f8543c54be4cdc0efba2da80552"} Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.834083 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-436f-account-create-update-2vstr" event={"ID":"515c1e67-f058-4858-a438-6efc983b5a03","Type":"ContainerStarted","Data":"f3f0bc35762f606a3a69c0d0a2611b6e8eb0b296574bfab9be6bcda0d2a03c2c"} Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.841808 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t88m7" event={"ID":"9f1d74d7-e2c1-4e67-8d72-785570da51f1","Type":"ContainerStarted","Data":"94e17ffca83e0037f3fd0de97aae4a3fdb75591a6f7cdc11175be6ce00e12eb8"} Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.841876 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t88m7" event={"ID":"9f1d74d7-e2c1-4e67-8d72-785570da51f1","Type":"ContainerStarted","Data":"955d79ac27feade957e118b5daa28e7147941c210400ec77a9ddb11a6745cde4"} Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.843562 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:50 crc kubenswrapper[5200]: I1126 15:07:50.873482 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-t88m7" podStartSLOduration=4.873459779 podStartE2EDuration="4.873459779s" podCreationTimestamp="2025-11-26 15:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:50.864600784 +0000 UTC m=+5839.543802602" watchObservedRunningTime="2025-11-26 15:07:50.873459779 +0000 UTC m=+5839.552661597" Nov 26 15:07:51 crc kubenswrapper[5200]: I1126 15:07:51.854247 5200 generic.go:358] "Generic (PLEG): container finished" podID="515c1e67-f058-4858-a438-6efc983b5a03" containerID="a136e76e3fd617c7b63332621e8d5c878a67a6a5a17be83c77a7a992153a1abc" exitCode=0 Nov 26 15:07:51 crc kubenswrapper[5200]: I1126 15:07:51.854448 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-436f-account-create-update-2vstr" 
event={"ID":"515c1e67-f058-4858-a438-6efc983b5a03","Type":"ContainerDied","Data":"a136e76e3fd617c7b63332621e8d5c878a67a6a5a17be83c77a7a992153a1abc"} Nov 26 15:07:51 crc kubenswrapper[5200]: I1126 15:07:51.856465 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.059523 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xmnrq"] Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.071158 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xmnrq"] Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.205458 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.351175 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkd9s\" (UniqueName: \"kubernetes.io/projected/99476d44-82e6-494b-aa46-55e545bace8a-kube-api-access-zkd9s\") pod \"99476d44-82e6-494b-aa46-55e545bace8a\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.351427 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99476d44-82e6-494b-aa46-55e545bace8a-operator-scripts\") pod \"99476d44-82e6-494b-aa46-55e545bace8a\" (UID: \"99476d44-82e6-494b-aa46-55e545bace8a\") " Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.353171 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99476d44-82e6-494b-aa46-55e545bace8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99476d44-82e6-494b-aa46-55e545bace8a" (UID: "99476d44-82e6-494b-aa46-55e545bace8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.357643 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99476d44-82e6-494b-aa46-55e545bace8a-kube-api-access-zkd9s" (OuterVolumeSpecName: "kube-api-access-zkd9s") pod "99476d44-82e6-494b-aa46-55e545bace8a" (UID: "99476d44-82e6-494b-aa46-55e545bace8a"). InnerVolumeSpecName "kube-api-access-zkd9s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.454043 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkd9s\" (UniqueName: \"kubernetes.io/projected/99476d44-82e6-494b-aa46-55e545bace8a-kube-api-access-zkd9s\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.454306 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99476d44-82e6-494b-aa46-55e545bace8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.868467 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-zk9wv" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.868453 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-zk9wv" event={"ID":"99476d44-82e6-494b-aa46-55e545bace8a","Type":"ContainerDied","Data":"071cc9f8bc8e77c735a1817acd6f7efc12b625f681ba60b94b6fb0c76fc6042a"} Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.869961 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="071cc9f8bc8e77c735a1817acd6f7efc12b625f681ba60b94b6fb0c76fc6042a" Nov 26 15:07:52 crc kubenswrapper[5200]: I1126 15:07:52.988953 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e3da23-c789-4052-ad28-f50b8b42cc52" path="/var/lib/kubelet/pods/f8e3da23-c789-4052-ad28-f50b8b42cc52/volumes" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.276848 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.375608 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gllz\" (UniqueName: \"kubernetes.io/projected/515c1e67-f058-4858-a438-6efc983b5a03-kube-api-access-4gllz\") pod \"515c1e67-f058-4858-a438-6efc983b5a03\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.375812 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515c1e67-f058-4858-a438-6efc983b5a03-operator-scripts\") pod \"515c1e67-f058-4858-a438-6efc983b5a03\" (UID: \"515c1e67-f058-4858-a438-6efc983b5a03\") " Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.376459 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/515c1e67-f058-4858-a438-6efc983b5a03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "515c1e67-f058-4858-a438-6efc983b5a03" (UID: "515c1e67-f058-4858-a438-6efc983b5a03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.385365 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515c1e67-f058-4858-a438-6efc983b5a03-kube-api-access-4gllz" (OuterVolumeSpecName: "kube-api-access-4gllz") pod "515c1e67-f058-4858-a438-6efc983b5a03" (UID: "515c1e67-f058-4858-a438-6efc983b5a03"). InnerVolumeSpecName "kube-api-access-4gllz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.478333 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gllz\" (UniqueName: \"kubernetes.io/projected/515c1e67-f058-4858-a438-6efc983b5a03-kube-api-access-4gllz\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.478364 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515c1e67-f058-4858-a438-6efc983b5a03-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.891973 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-436f-account-create-update-2vstr" event={"ID":"515c1e67-f058-4858-a438-6efc983b5a03","Type":"ContainerDied","Data":"f3f0bc35762f606a3a69c0d0a2611b6e8eb0b296574bfab9be6bcda0d2a03c2c"} Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.892020 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3f0bc35762f606a3a69c0d0a2611b6e8eb0b296574bfab9be6bcda0d2a03c2c" Nov 26 15:07:53 crc kubenswrapper[5200]: I1126 15:07:53.892137 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-436f-account-create-update-2vstr" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.441732 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-g9nxx"] Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.445342 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="515c1e67-f058-4858-a438-6efc983b5a03" containerName="mariadb-account-create-update" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.445393 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="515c1e67-f058-4858-a438-6efc983b5a03" containerName="mariadb-account-create-update" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.445471 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99476d44-82e6-494b-aa46-55e545bace8a" containerName="mariadb-database-create" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.445486 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="99476d44-82e6-494b-aa46-55e545bace8a" containerName="mariadb-database-create" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.446311 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="515c1e67-f058-4858-a438-6efc983b5a03" containerName="mariadb-account-create-update" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.446368 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="99476d44-82e6-494b-aa46-55e545bace8a" containerName="mariadb-database-create" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.455777 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.472415 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-g9nxx"] Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.525810 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sks\" (UniqueName: \"kubernetes.io/projected/49d1cce1-73da-4750-a358-0dc637e7026d-kube-api-access-87sks\") pod \"octavia-persistence-db-create-g9nxx\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.526047 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49d1cce1-73da-4750-a358-0dc637e7026d-operator-scripts\") pod \"octavia-persistence-db-create-g9nxx\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.629580 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49d1cce1-73da-4750-a358-0dc637e7026d-operator-scripts\") pod \"octavia-persistence-db-create-g9nxx\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.630061 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87sks\" (UniqueName: \"kubernetes.io/projected/49d1cce1-73da-4750-a358-0dc637e7026d-kube-api-access-87sks\") pod \"octavia-persistence-db-create-g9nxx\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.630477 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49d1cce1-73da-4750-a358-0dc637e7026d-operator-scripts\") pod \"octavia-persistence-db-create-g9nxx\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.653712 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sks\" (UniqueName: \"kubernetes.io/projected/49d1cce1-73da-4750-a358-0dc637e7026d-kube-api-access-87sks\") pod \"octavia-persistence-db-create-g9nxx\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.793424 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.946844 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-708c-account-create-update-p6m8q"] Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.957902 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.963055 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-708c-account-create-update-p6m8q"] Nov 26 15:07:55 crc kubenswrapper[5200]: I1126 15:07:55.965081 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-persistence-db-secret\"" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.040107 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c787ca-1064-48e6-9360-b3d6e6321d86-operator-scripts\") pod \"octavia-708c-account-create-update-p6m8q\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.040373 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkz52\" (UniqueName: \"kubernetes.io/projected/11c787ca-1064-48e6-9360-b3d6e6321d86-kube-api-access-qkz52\") pod \"octavia-708c-account-create-update-p6m8q\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.142278 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c787ca-1064-48e6-9360-b3d6e6321d86-operator-scripts\") pod \"octavia-708c-account-create-update-p6m8q\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.142420 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkz52\" (UniqueName: \"kubernetes.io/projected/11c787ca-1064-48e6-9360-b3d6e6321d86-kube-api-access-qkz52\") pod \"octavia-708c-account-create-update-p6m8q\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.143252 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c787ca-1064-48e6-9360-b3d6e6321d86-operator-scripts\") pod \"octavia-708c-account-create-update-p6m8q\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.161703 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkz52\" (UniqueName: \"kubernetes.io/projected/11c787ca-1064-48e6-9360-b3d6e6321d86-kube-api-access-qkz52\") pod \"octavia-708c-account-create-update-p6m8q\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.284199 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.284332 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-g9nxx"] Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.759157 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-708c-account-create-update-p6m8q"] Nov 26 15:07:56 crc kubenswrapper[5200]: W1126 15:07:56.771135 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11c787ca_1064_48e6_9360_b3d6e6321d86.slice/crio-1f301b7d197ed0f8f35287953fb80729e8e888d4fc8969c020ffd6acb78ef31f WatchSource:0}: Error finding container 1f301b7d197ed0f8f35287953fb80729e8e888d4fc8969c020ffd6acb78ef31f: Status 404 returned error can't find the container with id 1f301b7d197ed0f8f35287953fb80729e8e888d4fc8969c020ffd6acb78ef31f Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.928648 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-708c-account-create-update-p6m8q" event={"ID":"11c787ca-1064-48e6-9360-b3d6e6321d86","Type":"ContainerStarted","Data":"1f301b7d197ed0f8f35287953fb80729e8e888d4fc8969c020ffd6acb78ef31f"} Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.931130 5200 generic.go:358] "Generic (PLEG): container finished" podID="49d1cce1-73da-4750-a358-0dc637e7026d" containerID="7c24a2c0fe463180eddd91c56dc8c84b5b19e7890e8297e6c1735a612e92e526" exitCode=0 Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.931256 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-g9nxx" event={"ID":"49d1cce1-73da-4750-a358-0dc637e7026d","Type":"ContainerDied","Data":"7c24a2c0fe463180eddd91c56dc8c84b5b19e7890e8297e6c1735a612e92e526"} Nov 26 15:07:56 crc kubenswrapper[5200]: I1126 15:07:56.931321 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-g9nxx" event={"ID":"49d1cce1-73da-4750-a358-0dc637e7026d","Type":"ContainerStarted","Data":"cf7e4ddc5ef13289df7e80804e55ac143cdd5f9ab7242318cc86ab7e6e57db1d"} Nov 26 15:07:57 crc kubenswrapper[5200]: I1126 15:07:57.949762 5200 generic.go:358] "Generic (PLEG): container finished" podID="11c787ca-1064-48e6-9360-b3d6e6321d86" containerID="91425ec885e2c8c75e89980389404ac7149e72fb82fa3b963a793ae3cea25aad" exitCode=0 Nov 26 15:07:57 crc kubenswrapper[5200]: I1126 15:07:57.949884 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-708c-account-create-update-p6m8q" event={"ID":"11c787ca-1064-48e6-9360-b3d6e6321d86","Type":"ContainerDied","Data":"91425ec885e2c8c75e89980389404ac7149e72fb82fa3b963a793ae3cea25aad"} Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.350803 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.405892 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49d1cce1-73da-4750-a358-0dc637e7026d-operator-scripts\") pod \"49d1cce1-73da-4750-a358-0dc637e7026d\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.406023 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87sks\" (UniqueName: \"kubernetes.io/projected/49d1cce1-73da-4750-a358-0dc637e7026d-kube-api-access-87sks\") pod \"49d1cce1-73da-4750-a358-0dc637e7026d\" (UID: \"49d1cce1-73da-4750-a358-0dc637e7026d\") " Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.406974 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49d1cce1-73da-4750-a358-0dc637e7026d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49d1cce1-73da-4750-a358-0dc637e7026d" (UID: "49d1cce1-73da-4750-a358-0dc637e7026d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.411601 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d1cce1-73da-4750-a358-0dc637e7026d-kube-api-access-87sks" (OuterVolumeSpecName: "kube-api-access-87sks") pod "49d1cce1-73da-4750-a358-0dc637e7026d" (UID: "49d1cce1-73da-4750-a358-0dc637e7026d"). InnerVolumeSpecName "kube-api-access-87sks". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.509231 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49d1cce1-73da-4750-a358-0dc637e7026d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.509552 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87sks\" (UniqueName: \"kubernetes.io/projected/49d1cce1-73da-4750-a358-0dc637e7026d-kube-api-access-87sks\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.967529 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-g9nxx" event={"ID":"49d1cce1-73da-4750-a358-0dc637e7026d","Type":"ContainerDied","Data":"cf7e4ddc5ef13289df7e80804e55ac143cdd5f9ab7242318cc86ab7e6e57db1d"} Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.967551 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-g9nxx" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.968124 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf7e4ddc5ef13289df7e80804e55ac143cdd5f9ab7242318cc86ab7e6e57db1d" Nov 26 15:07:58 crc kubenswrapper[5200]: I1126 15:07:58.971128 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:07:58 crc kubenswrapper[5200]: E1126 15:07:58.972249 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.447142 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.535379 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c787ca-1064-48e6-9360-b3d6e6321d86-operator-scripts\") pod \"11c787ca-1064-48e6-9360-b3d6e6321d86\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.535597 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkz52\" (UniqueName: \"kubernetes.io/projected/11c787ca-1064-48e6-9360-b3d6e6321d86-kube-api-access-qkz52\") pod \"11c787ca-1064-48e6-9360-b3d6e6321d86\" (UID: \"11c787ca-1064-48e6-9360-b3d6e6321d86\") " Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.536301 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c787ca-1064-48e6-9360-b3d6e6321d86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11c787ca-1064-48e6-9360-b3d6e6321d86" (UID: "11c787ca-1064-48e6-9360-b3d6e6321d86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.543423 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c787ca-1064-48e6-9360-b3d6e6321d86-kube-api-access-qkz52" (OuterVolumeSpecName: "kube-api-access-qkz52") pod "11c787ca-1064-48e6-9360-b3d6e6321d86" (UID: "11c787ca-1064-48e6-9360-b3d6e6321d86"). InnerVolumeSpecName "kube-api-access-qkz52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.637771 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c787ca-1064-48e6-9360-b3d6e6321d86-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.637807 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkz52\" (UniqueName: \"kubernetes.io/projected/11c787ca-1064-48e6-9360-b3d6e6321d86-kube-api-access-qkz52\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.977337 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-708c-account-create-update-p6m8q" Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.977419 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-708c-account-create-update-p6m8q" event={"ID":"11c787ca-1064-48e6-9360-b3d6e6321d86","Type":"ContainerDied","Data":"1f301b7d197ed0f8f35287953fb80729e8e888d4fc8969c020ffd6acb78ef31f"} Nov 26 15:07:59 crc kubenswrapper[5200]: I1126 15:07:59.977721 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f301b7d197ed0f8f35287953fb80729e8e888d4fc8969c020ffd6acb78ef31f" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.923265 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-5cc8d89f58-4k72z"] Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.926601 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11c787ca-1064-48e6-9360-b3d6e6321d86" containerName="mariadb-account-create-update" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.926635 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c787ca-1064-48e6-9360-b3d6e6321d86" containerName="mariadb-account-create-update" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.926738 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49d1cce1-73da-4750-a358-0dc637e7026d" containerName="mariadb-database-create" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.926750 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d1cce1-73da-4750-a358-0dc637e7026d" containerName="mariadb-database-create" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.927050 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="49d1cce1-73da-4750-a358-0dc637e7026d" containerName="mariadb-database-create" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.927072 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="11c787ca-1064-48e6-9360-b3d6e6321d86" containerName="mariadb-account-create-update" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.950125 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5cc8d89f58-4k72z"] Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.950332 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.956175 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-api-scripts\"" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.956278 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-octavia-dockercfg-8cttt\"" Nov 26 15:08:01 crc kubenswrapper[5200]: I1126 15:08:01.956629 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-api-config-data\"" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.105567 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-combined-ca-bundle\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.105646 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/97a21845-f322-4fd9-9e6a-2d988ecf13a1-octavia-run\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.105705 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/97a21845-f322-4fd9-9e6a-2d988ecf13a1-config-data-merged\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.105747 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-scripts\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.105824 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-config-data\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.208607 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-config-data\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.209579 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-combined-ca-bundle\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.209858 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/97a21845-f322-4fd9-9e6a-2d988ecf13a1-octavia-run\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.210079 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/97a21845-f322-4fd9-9e6a-2d988ecf13a1-config-data-merged\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.210293 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-scripts\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.210671 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/97a21845-f322-4fd9-9e6a-2d988ecf13a1-config-data-merged\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.211258 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/97a21845-f322-4fd9-9e6a-2d988ecf13a1-octavia-run\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.216742 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-scripts\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.216840 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-combined-ca-bundle\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.221556 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a21845-f322-4fd9-9e6a-2d988ecf13a1-config-data\") pod \"octavia-api-5cc8d89f58-4k72z\" (UID: \"97a21845-f322-4fd9-9e6a-2d988ecf13a1\") " pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.270382 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:02 crc kubenswrapper[5200]: W1126 15:08:02.778437 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97a21845_f322_4fd9_9e6a_2d988ecf13a1.slice/crio-99c4cfef5ff810ea5ac3af4853c47a759a08ca1ba109c3c48183be4c111681de WatchSource:0}: Error finding container 99c4cfef5ff810ea5ac3af4853c47a759a08ca1ba109c3c48183be4c111681de: Status 404 returned error can't find the container with id 99c4cfef5ff810ea5ac3af4853c47a759a08ca1ba109c3c48183be4c111681de Nov 26 15:08:02 crc kubenswrapper[5200]: I1126 15:08:02.780800 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5cc8d89f58-4k72z"] Nov 26 15:08:03 crc kubenswrapper[5200]: I1126 15:08:03.043250 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5cc8d89f58-4k72z" event={"ID":"97a21845-f322-4fd9-9e6a-2d988ecf13a1","Type":"ContainerStarted","Data":"99c4cfef5ff810ea5ac3af4853c47a759a08ca1ba109c3c48183be4c111681de"} Nov 26 15:08:11 crc kubenswrapper[5200]: I1126 15:08:11.970450 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:08:11 crc kubenswrapper[5200]: E1126 15:08:11.971370 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:08:19 crc kubenswrapper[5200]: I1126 15:08:19.220564 5200 generic.go:358] "Generic (PLEG): container finished" podID="97a21845-f322-4fd9-9e6a-2d988ecf13a1" containerID="38cf71e13106f7ad9c3468aa860bb9a1ff483d5df3159116f14c3dd57316d667" exitCode=0 Nov 26 15:08:19 crc kubenswrapper[5200]: I1126 15:08:19.220787 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5cc8d89f58-4k72z" event={"ID":"97a21845-f322-4fd9-9e6a-2d988ecf13a1","Type":"ContainerDied","Data":"38cf71e13106f7ad9c3468aa860bb9a1ff483d5df3159116f14c3dd57316d667"} Nov 26 15:08:19 crc kubenswrapper[5200]: I1126 15:08:19.859365 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vqkcw" podUID="b2e3f416-2e99-40ef-b104-c2ef740b5a2b" containerName="ovn-controller" probeResult="failure" output=< Nov 26 15:08:19 crc kubenswrapper[5200]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 15:08:19 crc kubenswrapper[5200]: > Nov 26 15:08:20 crc kubenswrapper[5200]: I1126 15:08:20.237255 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5cc8d89f58-4k72z" event={"ID":"97a21845-f322-4fd9-9e6a-2d988ecf13a1","Type":"ContainerStarted","Data":"bbfc238e70423468122661814d247492257365a096ffe7d61dc7af38d47b7361"} Nov 26 15:08:20 crc kubenswrapper[5200]: I1126 15:08:20.237718 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:20 crc kubenswrapper[5200]: I1126 15:08:20.237739 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:20 crc kubenswrapper[5200]: I1126 15:08:20.237749 5200 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openstack/octavia-api-5cc8d89f58-4k72z" event={"ID":"97a21845-f322-4fd9-9e6a-2d988ecf13a1","Type":"ContainerStarted","Data":"d5a60577f4fb5fcb7780b292fbd81200ea0d9c49e105e1c6d0930931d7f6280f"} Nov 26 15:08:20 crc kubenswrapper[5200]: I1126 15:08:20.269804 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-5cc8d89f58-4k72z" podStartSLOduration=3.800751693 podStartE2EDuration="19.269783519s" podCreationTimestamp="2025-11-26 15:08:01 +0000 UTC" firstStartedPulling="2025-11-26 15:08:02.781591832 +0000 UTC m=+5851.460793650" lastFinishedPulling="2025-11-26 15:08:18.250623658 +0000 UTC m=+5866.929825476" observedRunningTime="2025-11-26 15:08:20.26001978 +0000 UTC m=+5868.939221588" watchObservedRunningTime="2025-11-26 15:08:20.269783519 +0000 UTC m=+5868.948985337" Nov 26 15:08:22 crc kubenswrapper[5200]: I1126 15:08:22.923367 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:08:22 crc kubenswrapper[5200]: I1126 15:08:22.928288 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t88m7" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.155198 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vqkcw-config-hzkbz"] Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.183844 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vqkcw-config-hzkbz"] Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.183981 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.190142 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-extra-scripts\"" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.296105 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-additional-scripts\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.296196 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run-ovn\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.296369 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5dd\" (UniqueName: \"kubernetes.io/projected/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-kube-api-access-dk5dd\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.296431 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " 
pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.296539 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-log-ovn\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.296584 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-scripts\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399128 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-scripts\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399358 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-additional-scripts\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399421 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run-ovn\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399653 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5dd\" (UniqueName: \"kubernetes.io/projected/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-kube-api-access-dk5dd\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399832 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399936 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-log-ovn\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.399952 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run-ovn\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: 
\"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.400049 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-log-ovn\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.400096 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.400385 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-additional-scripts\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.402573 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-scripts\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.420599 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5dd\" (UniqueName: \"kubernetes.io/projected/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-kube-api-access-dk5dd\") pod \"ovn-controller-vqkcw-config-hzkbz\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.513852 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:23 crc kubenswrapper[5200]: I1126 15:08:23.977308 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vqkcw-config-hzkbz"] Nov 26 15:08:24 crc kubenswrapper[5200]: I1126 15:08:24.295126 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vqkcw-config-hzkbz" event={"ID":"1dcaf25b-59ca-4d6d-a5b4-843cd246693f","Type":"ContainerStarted","Data":"906205e820bf3c20c09dada996d6b1329bc17982d07eb660100816d97072c1a9"} Nov 26 15:08:24 crc kubenswrapper[5200]: I1126 15:08:24.865515 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vqkcw" Nov 26 15:08:24 crc kubenswrapper[5200]: I1126 15:08:24.972026 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:08:24 crc kubenswrapper[5200]: E1126 15:08:24.972366 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:08:25 crc kubenswrapper[5200]: I1126 15:08:25.309218 5200 generic.go:358] "Generic (PLEG): container finished" podID="1dcaf25b-59ca-4d6d-a5b4-843cd246693f" containerID="ac542146776fe8648e3c5dd64dc05b58360d195975eeab1901b6239405f1e2ab" exitCode=0 Nov 26 15:08:25 crc kubenswrapper[5200]: I1126 15:08:25.309395 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vqkcw-config-hzkbz" event={"ID":"1dcaf25b-59ca-4d6d-a5b4-843cd246693f","Type":"ContainerDied","Data":"ac542146776fe8648e3c5dd64dc05b58360d195975eeab1901b6239405f1e2ab"} Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.648202 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782346 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run\") pod \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782444 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk5dd\" (UniqueName: \"kubernetes.io/projected/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-kube-api-access-dk5dd\") pod \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782451 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run" (OuterVolumeSpecName: "var-run") pod "1dcaf25b-59ca-4d6d-a5b4-843cd246693f" (UID: "1dcaf25b-59ca-4d6d-a5b4-843cd246693f"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782486 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run-ovn\") pod \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782514 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-log-ovn\") pod \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782610 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1dcaf25b-59ca-4d6d-a5b4-843cd246693f" (UID: "1dcaf25b-59ca-4d6d-a5b4-843cd246693f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782728 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-additional-scripts\") pod \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782770 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-scripts\") pod \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\" (UID: \"1dcaf25b-59ca-4d6d-a5b4-843cd246693f\") " Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.782763 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1dcaf25b-59ca-4d6d-a5b4-843cd246693f" (UID: "1dcaf25b-59ca-4d6d-a5b4-843cd246693f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.783147 5200 reconciler_common.go:299] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.783159 5200 reconciler_common.go:299] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.783168 5200 reconciler_common.go:299] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-var-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.783193 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1dcaf25b-59ca-4d6d-a5b4-843cd246693f" (UID: "1dcaf25b-59ca-4d6d-a5b4-843cd246693f"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.784475 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-scripts" (OuterVolumeSpecName: "scripts") pod "1dcaf25b-59ca-4d6d-a5b4-843cd246693f" (UID: "1dcaf25b-59ca-4d6d-a5b4-843cd246693f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.788496 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-kube-api-access-dk5dd" (OuterVolumeSpecName: "kube-api-access-dk5dd") pod "1dcaf25b-59ca-4d6d-a5b4-843cd246693f" (UID: "1dcaf25b-59ca-4d6d-a5b4-843cd246693f"). InnerVolumeSpecName "kube-api-access-dk5dd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.885043 5200 reconciler_common.go:299] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.885081 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:26 crc kubenswrapper[5200]: I1126 15:08:26.885093 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dk5dd\" (UniqueName: \"kubernetes.io/projected/1dcaf25b-59ca-4d6d-a5b4-843cd246693f-kube-api-access-dk5dd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[5200]: I1126 15:08:27.330927 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vqkcw-config-hzkbz" Nov 26 15:08:27 crc kubenswrapper[5200]: I1126 15:08:27.330955 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vqkcw-config-hzkbz" event={"ID":"1dcaf25b-59ca-4d6d-a5b4-843cd246693f","Type":"ContainerDied","Data":"906205e820bf3c20c09dada996d6b1329bc17982d07eb660100816d97072c1a9"} Nov 26 15:08:27 crc kubenswrapper[5200]: I1126 15:08:27.331008 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="906205e820bf3c20c09dada996d6b1329bc17982d07eb660100816d97072c1a9" Nov 26 15:08:27 crc kubenswrapper[5200]: I1126 15:08:27.748909 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vqkcw-config-hzkbz"] Nov 26 15:08:27 crc kubenswrapper[5200]: I1126 15:08:27.799028 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vqkcw-config-hzkbz"] Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.175380 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-k5cn8"] Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.177039 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1dcaf25b-59ca-4d6d-a5b4-843cd246693f" containerName="ovn-config" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.177071 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcaf25b-59ca-4d6d-a5b4-843cd246693f" containerName="ovn-config" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.177292 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1dcaf25b-59ca-4d6d-a5b4-843cd246693f" containerName="ovn-config" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.185342 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.188184 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-k5cn8"] Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.190123 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-rsyslog-scripts\"" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.190333 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-rsyslog-config-data\"" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.190385 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"octavia-hmport-map\"" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.319435 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f39891c5-2a90-40c2-b035-e19b9b63363a-hm-ports\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.319486 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f39891c5-2a90-40c2-b035-e19b9b63363a-config-data-merged\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.319525 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f39891c5-2a90-40c2-b035-e19b9b63363a-scripts\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.319715 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39891c5-2a90-40c2-b035-e19b9b63363a-config-data\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.422363 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39891c5-2a90-40c2-b035-e19b9b63363a-config-data\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.422554 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f39891c5-2a90-40c2-b035-e19b9b63363a-hm-ports\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.422585 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f39891c5-2a90-40c2-b035-e19b9b63363a-config-data-merged\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.422619 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f39891c5-2a90-40c2-b035-e19b9b63363a-scripts\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.423379 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f39891c5-2a90-40c2-b035-e19b9b63363a-config-data-merged\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.423763 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f39891c5-2a90-40c2-b035-e19b9b63363a-hm-ports\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.429540 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f39891c5-2a90-40c2-b035-e19b9b63363a-config-data\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.434912 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f39891c5-2a90-40c2-b035-e19b9b63363a-scripts\") pod \"octavia-rsyslog-k5cn8\" (UID: \"f39891c5-2a90-40c2-b035-e19b9b63363a\") " pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.522647 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.767175 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-kzdqh"] Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.774916 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.779003 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-config-data\"" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.787661 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-kzdqh"] Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.824016 5200 scope.go:117] "RemoveContainer" containerID="deeb37e98121744b207b8aaa4e6b9e9fc0f058894aa41541f3f261a7821aea76" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.878070 5200 scope.go:117] "RemoveContainer" containerID="ec62efc77353b3b48cb443a7bfaa527aef9aee00631706174ff25b42d3925b8a" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.921298 5200 scope.go:117] "RemoveContainer" containerID="1106ff9622ccfd2e054ba2412a00d94be6f0fbda0acbb7c41719b4059c523b51" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.932158 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2f81c7cb-6771-4fb8-b392-f1ee540fba03-amphora-image\") pod \"octavia-image-upload-fcc7bff87-kzdqh\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.932329 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f81c7cb-6771-4fb8-b392-f1ee540fba03-httpd-config\") pod \"octavia-image-upload-fcc7bff87-kzdqh\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.978535 5200 scope.go:117] "RemoveContainer" containerID="6075b08fbd21839b843d7d68d1b8756c6fd3c0667848854ebae0d50349ed9fdd" Nov 26 15:08:28 crc kubenswrapper[5200]: I1126 15:08:28.989275 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcaf25b-59ca-4d6d-a5b4-843cd246693f" path="/var/lib/kubelet/pods/1dcaf25b-59ca-4d6d-a5b4-843cd246693f/volumes" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.013981 5200 scope.go:117] "RemoveContainer" containerID="7f39c454ac7e798c226e8e6f42eda2a9f0f7831bf59407a3071188f6f1ec3d4f" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.035206 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f81c7cb-6771-4fb8-b392-f1ee540fba03-httpd-config\") pod \"octavia-image-upload-fcc7bff87-kzdqh\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.035294 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2f81c7cb-6771-4fb8-b392-f1ee540fba03-amphora-image\") pod \"octavia-image-upload-fcc7bff87-kzdqh\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.036126 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2f81c7cb-6771-4fb8-b392-f1ee540fba03-amphora-image\") pod \"octavia-image-upload-fcc7bff87-kzdqh\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") 
" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.043645 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f81c7cb-6771-4fb8-b392-f1ee540fba03-httpd-config\") pod \"octavia-image-upload-fcc7bff87-kzdqh\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.084280 5200 scope.go:117] "RemoveContainer" containerID="e63d46ec4f4eeacb453324a2dbf8ad058535b8e7b5379224415b65ed9edf3a08" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.101000 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.113137 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-k5cn8"] Nov 26 15:08:29 crc kubenswrapper[5200]: W1126 15:08:29.129549 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf39891c5_2a90_40c2_b035_e19b9b63363a.slice/crio-f92d6a610ad547ee37f8dc6e1aca24fb90db0d82aa8658f770cd2dd4fa48ce40 WatchSource:0}: Error finding container f92d6a610ad547ee37f8dc6e1aca24fb90db0d82aa8658f770cd2dd4fa48ce40: Status 404 returned error can't find the container with id f92d6a610ad547ee37f8dc6e1aca24fb90db0d82aa8658f770cd2dd4fa48ce40 Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.351802 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-k5cn8" event={"ID":"f39891c5-2a90-40c2-b035-e19b9b63363a","Type":"ContainerStarted","Data":"f92d6a610ad547ee37f8dc6e1aca24fb90db0d82aa8658f770cd2dd4fa48ce40"} Nov 26 15:08:29 crc kubenswrapper[5200]: I1126 15:08:29.585561 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-kzdqh"] Nov 26 15:08:30 crc kubenswrapper[5200]: I1126 15:08:30.362357 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" event={"ID":"2f81c7cb-6771-4fb8-b392-f1ee540fba03","Type":"ContainerStarted","Data":"b723640dff1618f9129629ebaa0e411ed5e3838394cd84ec0f2759681edf3f26"} Nov 26 15:08:33 crc kubenswrapper[5200]: I1126 15:08:33.397984 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-k5cn8" event={"ID":"f39891c5-2a90-40c2-b035-e19b9b63363a","Type":"ContainerStarted","Data":"99d552314a17351695c57d7c737e818c0268246b9cf1c7bb4a102c7ef085339e"} Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.691532 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-4wzlf"] Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.704081 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.705242 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-4wzlf"] Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.708806 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-scripts\"" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.800644 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-combined-ca-bundle\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.800734 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.800780 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-scripts\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.800805 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data-merged\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.903186 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-scripts\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.903240 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data-merged\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.903509 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-combined-ca-bundle\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.903545 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.904997 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data-merged\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.915748 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.926797 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-scripts\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:34 crc kubenswrapper[5200]: I1126 15:08:34.938042 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-combined-ca-bundle\") pod \"octavia-db-sync-4wzlf\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:35 crc kubenswrapper[5200]: I1126 15:08:35.036520 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:35 crc kubenswrapper[5200]: I1126 15:08:35.424971 5200 generic.go:358] "Generic (PLEG): container finished" podID="f39891c5-2a90-40c2-b035-e19b9b63363a" containerID="99d552314a17351695c57d7c737e818c0268246b9cf1c7bb4a102c7ef085339e" exitCode=0 Nov 26 15:08:35 crc kubenswrapper[5200]: I1126 15:08:35.425454 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-k5cn8" event={"ID":"f39891c5-2a90-40c2-b035-e19b9b63363a","Type":"ContainerDied","Data":"99d552314a17351695c57d7c737e818c0268246b9cf1c7bb4a102c7ef085339e"} Nov 26 15:08:35 crc kubenswrapper[5200]: I1126 15:08:35.566446 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-4wzlf"] Nov 26 15:08:36 crc kubenswrapper[5200]: I1126 15:08:36.448311 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-4wzlf" event={"ID":"9bab952e-c8d7-4737-a59a-ef8c81bb6256","Type":"ContainerStarted","Data":"f1240df14dd6cca9ccee3abf28b60c255c51aedca1c7eb377146b85d9d606d09"} Nov 26 15:08:36 crc kubenswrapper[5200]: I1126 15:08:36.970103 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:08:36 crc kubenswrapper[5200]: E1126 15:08:36.970792 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:08:41 crc kubenswrapper[5200]: I1126 15:08:41.777044 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:41 crc kubenswrapper[5200]: I1126 15:08:41.787512 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/octavia-api-5cc8d89f58-4k72z" Nov 26 15:08:42 crc kubenswrapper[5200]: E1126 15:08:42.646409 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1969856 actualBytes=10240 Nov 26 15:08:47 crc kubenswrapper[5200]: I1126 15:08:47.581530 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" event={"ID":"2f81c7cb-6771-4fb8-b392-f1ee540fba03","Type":"ContainerStarted","Data":"2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc"} Nov 26 15:08:47 crc kubenswrapper[5200]: I1126 15:08:47.585905 5200 generic.go:358] "Generic (PLEG): container finished" podID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerID="9d21a098df1541a5fc40993e87159e3d19972c6f3c61f5b30e2e9b7c54b5d57e" exitCode=0 Nov 26 15:08:47 crc kubenswrapper[5200]: I1126 15:08:47.586071 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-4wzlf" event={"ID":"9bab952e-c8d7-4737-a59a-ef8c81bb6256","Type":"ContainerDied","Data":"9d21a098df1541a5fc40993e87159e3d19972c6f3c61f5b30e2e9b7c54b5d57e"} Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.597185 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-4wzlf" event={"ID":"9bab952e-c8d7-4737-a59a-ef8c81bb6256","Type":"ContainerStarted","Data":"33bca999f0b0ac65592ec44212465f5970290b732cf7a16cb569690e64156937"} Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.603650 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-k5cn8" event={"ID":"f39891c5-2a90-40c2-b035-e19b9b63363a","Type":"ContainerStarted","Data":"cec014b64e787113544708833d348305751fe37ccd82343e07482178e67cc503"} Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.603915 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.609407 5200 generic.go:358] "Generic (PLEG): container finished" podID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerID="2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc" exitCode=0 Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.609481 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" event={"ID":"2f81c7cb-6771-4fb8-b392-f1ee540fba03","Type":"ContainerDied","Data":"2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc"} Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.624165 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-4wzlf" podStartSLOduration=14.624140208 podStartE2EDuration="14.624140208s" podCreationTimestamp="2025-11-26 15:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:08:48.613210587 +0000 UTC m=+5897.292412415" watchObservedRunningTime="2025-11-26 15:08:48.624140208 +0000 UTC m=+5897.303342026" Nov 26 15:08:48 crc kubenswrapper[5200]: I1126 15:08:48.663331 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-k5cn8" podStartSLOduration=1.772042076 podStartE2EDuration="20.663309309s" podCreationTimestamp="2025-11-26 15:08:28 +0000 UTC" firstStartedPulling="2025-11-26 15:08:29.142348241 +0000 UTC m=+5877.821550059" lastFinishedPulling="2025-11-26 15:08:48.033615474 +0000 UTC m=+5896.712817292" observedRunningTime="2025-11-26 
15:08:48.650914799 +0000 UTC m=+5897.330116637" watchObservedRunningTime="2025-11-26 15:08:48.663309309 +0000 UTC m=+5897.342511117" Nov 26 15:08:49 crc kubenswrapper[5200]: I1126 15:08:49.970168 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:08:49 crc kubenswrapper[5200]: E1126 15:08:49.970833 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:08:50 crc kubenswrapper[5200]: I1126 15:08:50.630097 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" event={"ID":"2f81c7cb-6771-4fb8-b392-f1ee540fba03","Type":"ContainerStarted","Data":"624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e"} Nov 26 15:08:50 crc kubenswrapper[5200]: I1126 15:08:50.633172 5200 generic.go:358] "Generic (PLEG): container finished" podID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerID="33bca999f0b0ac65592ec44212465f5970290b732cf7a16cb569690e64156937" exitCode=0 Nov 26 15:08:50 crc kubenswrapper[5200]: I1126 15:08:50.633346 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-4wzlf" event={"ID":"9bab952e-c8d7-4737-a59a-ef8c81bb6256","Type":"ContainerDied","Data":"33bca999f0b0ac65592ec44212465f5970290b732cf7a16cb569690e64156937"} Nov 26 15:08:50 crc kubenswrapper[5200]: I1126 15:08:50.645891 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" podStartSLOduration=2.038140287 podStartE2EDuration="22.645877746s" podCreationTimestamp="2025-11-26 15:08:28 +0000 UTC" firstStartedPulling="2025-11-26 15:08:29.598076492 +0000 UTC m=+5878.277278310" lastFinishedPulling="2025-11-26 15:08:50.205813951 +0000 UTC m=+5898.885015769" observedRunningTime="2025-11-26 15:08:50.644967212 +0000 UTC m=+5899.324169040" watchObservedRunningTime="2025-11-26 15:08:50.645877746 +0000 UTC m=+5899.325079564" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.039202 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.136757 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-combined-ca-bundle\") pod \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.137002 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data-merged\") pod \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.137049 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-scripts\") pod \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.137290 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data\") pod \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\" (UID: \"9bab952e-c8d7-4737-a59a-ef8c81bb6256\") " Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.142176 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-scripts" (OuterVolumeSpecName: "scripts") pod "9bab952e-c8d7-4737-a59a-ef8c81bb6256" (UID: "9bab952e-c8d7-4737-a59a-ef8c81bb6256"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.142481 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data" (OuterVolumeSpecName: "config-data") pod "9bab952e-c8d7-4737-a59a-ef8c81bb6256" (UID: "9bab952e-c8d7-4737-a59a-ef8c81bb6256"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.160384 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "9bab952e-c8d7-4737-a59a-ef8c81bb6256" (UID: "9bab952e-c8d7-4737-a59a-ef8c81bb6256"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.164032 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bab952e-c8d7-4737-a59a-ef8c81bb6256" (UID: "9bab952e-c8d7-4737-a59a-ef8c81bb6256"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.239079 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.239112 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.239123 5200 reconciler_common.go:299] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9bab952e-c8d7-4737-a59a-ef8c81bb6256-config-data-merged\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.239131 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bab952e-c8d7-4737-a59a-ef8c81bb6256-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.654731 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-4wzlf" event={"ID":"9bab952e-c8d7-4737-a59a-ef8c81bb6256","Type":"ContainerDied","Data":"f1240df14dd6cca9ccee3abf28b60c255c51aedca1c7eb377146b85d9d606d09"} Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.654779 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1240df14dd6cca9ccee3abf28b60c255c51aedca1c7eb377146b85d9d606d09" Nov 26 15:08:52 crc kubenswrapper[5200]: I1126 15:08:52.655924 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-4wzlf" Nov 26 15:09:01 crc kubenswrapper[5200]: I1126 15:09:01.970318 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:09:01 crc kubenswrapper[5200]: E1126 15:09:01.971198 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:09:04 crc kubenswrapper[5200]: I1126 15:09:04.661658 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-k5cn8" Nov 26 15:09:15 crc kubenswrapper[5200]: I1126 15:09:15.969340 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:09:15 crc kubenswrapper[5200]: E1126 15:09:15.970131 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:09:16 crc kubenswrapper[5200]: I1126 15:09:16.849446 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-kzdqh"] Nov 26 15:09:16 crc kubenswrapper[5200]: 
I1126 15:09:16.850060 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerName="octavia-amphora-httpd" containerID="cri-o://624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e" gracePeriod=30 Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.466764 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.588181 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2f81c7cb-6771-4fb8-b392-f1ee540fba03-amphora-image\") pod \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.588489 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f81c7cb-6771-4fb8-b392-f1ee540fba03-httpd-config\") pod \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\" (UID: \"2f81c7cb-6771-4fb8-b392-f1ee540fba03\") " Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.620636 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f81c7cb-6771-4fb8-b392-f1ee540fba03-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2f81c7cb-6771-4fb8-b392-f1ee540fba03" (UID: "2f81c7cb-6771-4fb8-b392-f1ee540fba03"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.682902 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f81c7cb-6771-4fb8-b392-f1ee540fba03-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "2f81c7cb-6771-4fb8-b392-f1ee540fba03" (UID: "2f81c7cb-6771-4fb8-b392-f1ee540fba03"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.692721 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f81c7cb-6771-4fb8-b392-f1ee540fba03-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.692753 5200 reconciler_common.go:299] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2f81c7cb-6771-4fb8-b392-f1ee540fba03-amphora-image\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.930495 5200 generic.go:358] "Generic (PLEG): container finished" podID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerID="624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e" exitCode=0 Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.930580 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.930605 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" event={"ID":"2f81c7cb-6771-4fb8-b392-f1ee540fba03","Type":"ContainerDied","Data":"624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e"} Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.930653 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-kzdqh" event={"ID":"2f81c7cb-6771-4fb8-b392-f1ee540fba03","Type":"ContainerDied","Data":"b723640dff1618f9129629ebaa0e411ed5e3838394cd84ec0f2759681edf3f26"} Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.930677 5200 scope.go:117] "RemoveContainer" containerID="624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.965477 5200 scope.go:117] "RemoveContainer" containerID="2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.974126 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-kzdqh"] Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.985562 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-kzdqh"] Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.991948 5200 scope.go:117] "RemoveContainer" containerID="624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e" Nov 26 15:09:17 crc kubenswrapper[5200]: E1126 15:09:17.992307 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e\": container with ID starting with 624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e not found: ID does not exist" containerID="624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.992345 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e"} err="failed to get container status \"624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e\": rpc error: code = NotFound desc = could not find container \"624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e\": container with ID starting with 624fb4e6654e5e3898894a84a2d4b870058de416c1c69857483b427fcee7219e not found: ID does not exist" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.992370 5200 scope.go:117] "RemoveContainer" containerID="2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc" Nov 26 15:09:17 crc kubenswrapper[5200]: E1126 15:09:17.992583 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc\": container with ID starting with 2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc not found: ID does not exist" containerID="2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc" Nov 26 15:09:17 crc kubenswrapper[5200]: I1126 15:09:17.992614 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc"} err="failed to get 
container status \"2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc\": rpc error: code = NotFound desc = could not find container \"2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc\": container with ID starting with 2b9c7741dd176e2643664af1fddd005e98dd02873ca6a12dac7d10c423ed83dc not found: ID does not exist" Nov 26 15:09:18 crc kubenswrapper[5200]: I1126 15:09:18.979891 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" path="/var/lib/kubelet/pods/2f81c7cb-6771-4fb8-b392-f1ee540fba03/volumes" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.061898 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-qqxfk"] Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063719 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerName="octavia-amphora-httpd" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063745 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerName="octavia-amphora-httpd" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063785 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerName="init" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063794 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerName="init" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063813 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerName="init" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063820 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerName="init" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063864 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerName="octavia-db-sync" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.063872 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerName="octavia-db-sync" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.066890 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f81c7cb-6771-4fb8-b392-f1ee540fba03" containerName="octavia-amphora-httpd" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.066955 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" containerName="octavia-db-sync" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.081273 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.087263 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-config-data\"" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.121566 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-qqxfk"] Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.139075 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60d99396-e942-4d33-88ff-f0c5e8b135a0-httpd-config\") pod \"octavia-image-upload-fcc7bff87-qqxfk\" (UID: \"60d99396-e942-4d33-88ff-f0c5e8b135a0\") " pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.139271 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/60d99396-e942-4d33-88ff-f0c5e8b135a0-amphora-image\") pod \"octavia-image-upload-fcc7bff87-qqxfk\" (UID: \"60d99396-e942-4d33-88ff-f0c5e8b135a0\") " pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.240557 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60d99396-e942-4d33-88ff-f0c5e8b135a0-httpd-config\") pod \"octavia-image-upload-fcc7bff87-qqxfk\" (UID: \"60d99396-e942-4d33-88ff-f0c5e8b135a0\") " pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.240773 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/60d99396-e942-4d33-88ff-f0c5e8b135a0-amphora-image\") pod \"octavia-image-upload-fcc7bff87-qqxfk\" (UID: \"60d99396-e942-4d33-88ff-f0c5e8b135a0\") " pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.241450 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/60d99396-e942-4d33-88ff-f0c5e8b135a0-amphora-image\") pod \"octavia-image-upload-fcc7bff87-qqxfk\" (UID: \"60d99396-e942-4d33-88ff-f0c5e8b135a0\") " pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.257474 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60d99396-e942-4d33-88ff-f0c5e8b135a0-httpd-config\") pod \"octavia-image-upload-fcc7bff87-qqxfk\" (UID: \"60d99396-e942-4d33-88ff-f0c5e8b135a0\") " pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.413109 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.915516 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-fcc7bff87-qqxfk"] Nov 26 15:09:20 crc kubenswrapper[5200]: I1126 15:09:20.965731 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" event={"ID":"60d99396-e942-4d33-88ff-f0c5e8b135a0","Type":"ContainerStarted","Data":"fa41b2595569dacf2890856efb81a4e3892de703001c61cd9f4e44975e3f628d"} Nov 26 15:09:21 crc kubenswrapper[5200]: I1126 15:09:21.978328 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" event={"ID":"60d99396-e942-4d33-88ff-f0c5e8b135a0","Type":"ContainerStarted","Data":"734ddac36da8cd952ea5e9c8db777e2c83dcb1ef5a522239641af77767487b09"} Nov 26 15:09:22 crc kubenswrapper[5200]: I1126 15:09:22.994586 5200 generic.go:358] "Generic (PLEG): container finished" podID="60d99396-e942-4d33-88ff-f0c5e8b135a0" containerID="734ddac36da8cd952ea5e9c8db777e2c83dcb1ef5a522239641af77767487b09" exitCode=0 Nov 26 15:09:22 crc kubenswrapper[5200]: I1126 15:09:22.994744 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" event={"ID":"60d99396-e942-4d33-88ff-f0c5e8b135a0","Type":"ContainerDied","Data":"734ddac36da8cd952ea5e9c8db777e2c83dcb1ef5a522239641af77767487b09"} Nov 26 15:09:25 crc kubenswrapper[5200]: I1126 15:09:25.021560 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" event={"ID":"60d99396-e942-4d33-88ff-f0c5e8b135a0","Type":"ContainerStarted","Data":"32d691bf421a5819f317dfa103f7e9764883426e4fa0da1b525b36f59fd7c298"} Nov 26 15:09:25 crc kubenswrapper[5200]: I1126 15:09:25.047419 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-fcc7bff87-qqxfk" podStartSLOduration=1.384260531 podStartE2EDuration="5.04739646s" podCreationTimestamp="2025-11-26 15:09:20 +0000 UTC" firstStartedPulling="2025-11-26 15:09:20.910019529 +0000 UTC m=+5929.589221347" lastFinishedPulling="2025-11-26 15:09:24.573155458 +0000 UTC m=+5933.252357276" observedRunningTime="2025-11-26 15:09:25.032357952 +0000 UTC m=+5933.711559780" watchObservedRunningTime="2025-11-26 15:09:25.04739646 +0000 UTC m=+5933.726598278" Nov 26 15:09:28 crc kubenswrapper[5200]: I1126 15:09:28.970766 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:09:28 crc kubenswrapper[5200]: E1126 15:09:28.971525 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.626792 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-nnl9w"] Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.639136 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.641367 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-healthmanager-config-data\"" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.642420 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-healthmanager-scripts\"" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.646065 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-certs-secret\"" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.647276 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-nnl9w"] Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.721883 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/43065d0b-e6ea-4c72-b512-58c58235ede5-config-data-merged\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.722243 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/43065d0b-e6ea-4c72-b512-58c58235ede5-hm-ports\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.722327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-combined-ca-bundle\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.722362 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-amphora-certs\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.722397 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-config-data\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.722417 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-scripts\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825034 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/43065d0b-e6ea-4c72-b512-58c58235ede5-hm-ports\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " 
pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825143 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-combined-ca-bundle\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825184 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-amphora-certs\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825220 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-config-data\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825240 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-scripts\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825282 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/43065d0b-e6ea-4c72-b512-58c58235ede5-config-data-merged\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.825890 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/43065d0b-e6ea-4c72-b512-58c58235ede5-config-data-merged\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.827242 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/43065d0b-e6ea-4c72-b512-58c58235ede5-hm-ports\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.831555 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-config-data\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.832362 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-amphora-certs\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.832581 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-combined-ca-bundle\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.832888 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43065d0b-e6ea-4c72-b512-58c58235ede5-scripts\") pod \"octavia-healthmanager-nnl9w\" (UID: \"43065d0b-e6ea-4c72-b512-58c58235ede5\") " pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:32 crc kubenswrapper[5200]: I1126 15:09:32.963840 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:33 crc kubenswrapper[5200]: I1126 15:09:33.544238 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-nnl9w"] Nov 26 15:09:33 crc kubenswrapper[5200]: W1126 15:09:33.548449 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43065d0b_e6ea_4c72_b512_58c58235ede5.slice/crio-5077007a32c9a31f81f89475abfa8954b2505349ec1bd2a824bd0e009309a655 WatchSource:0}: Error finding container 5077007a32c9a31f81f89475abfa8954b2505349ec1bd2a824bd0e009309a655: Status 404 returned error can't find the container with id 5077007a32c9a31f81f89475abfa8954b2505349ec1bd2a824bd0e009309a655 Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.092511 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-rd8d9"] Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.117839 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-nnl9w" event={"ID":"43065d0b-e6ea-4c72-b512-58c58235ede5","Type":"ContainerStarted","Data":"318dbb4eed90c2b01737bd1b9fa689fcaa9a87415401165e949c4f830326bb70"} Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.117878 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-nnl9w" event={"ID":"43065d0b-e6ea-4c72-b512-58c58235ede5","Type":"ContainerStarted","Data":"5077007a32c9a31f81f89475abfa8954b2505349ec1bd2a824bd0e009309a655"} Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.117891 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-rd8d9"] Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.117997 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.120624 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-housekeeping-scripts\"" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.121079 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-housekeeping-config-data\"" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.261023 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0eff5089-5023-4c2f-930f-1bc252c879a9-config-data-merged\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.261182 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-scripts\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.261255 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0eff5089-5023-4c2f-930f-1bc252c879a9-hm-ports\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.261363 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-amphora-certs\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.261461 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-combined-ca-bundle\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.261530 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-config-data\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.363669 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-combined-ca-bundle\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.363764 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-config-data\") pod \"octavia-housekeeping-rd8d9\" (UID: 
\"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.363825 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0eff5089-5023-4c2f-930f-1bc252c879a9-config-data-merged\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.363924 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-scripts\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.363988 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0eff5089-5023-4c2f-930f-1bc252c879a9-hm-ports\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.364016 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-amphora-certs\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.364908 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0eff5089-5023-4c2f-930f-1bc252c879a9-config-data-merged\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.365549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0eff5089-5023-4c2f-930f-1bc252c879a9-hm-ports\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.370416 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-amphora-certs\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.370724 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-scripts\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.370754 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-config-data\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.388374 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5089-5023-4c2f-930f-1bc252c879a9-combined-ca-bundle\") pod \"octavia-housekeeping-rd8d9\" (UID: \"0eff5089-5023-4c2f-930f-1bc252c879a9\") " pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.442271 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.859297 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-rwxpq"] Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.871196 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.872155 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-rwxpq"] Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.875814 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-worker-scripts\"" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.876895 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"octavia-worker-config-data\"" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.976292 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-combined-ca-bundle\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.976365 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-amphora-certs\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.976386 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/01edcd7b-aec3-46e4-8638-545742dd0821-hm-ports\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.976434 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/01edcd7b-aec3-46e4-8638-545742dd0821-config-data-merged\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.976618 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-scripts\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:34 crc kubenswrapper[5200]: I1126 15:09:34.976670 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-config-data\") pod 
\"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.024616 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-rd8d9"] Nov 26 15:09:35 crc kubenswrapper[5200]: W1126 15:09:35.027841 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eff5089_5023_4c2f_930f_1bc252c879a9.slice/crio-f85b337aa4d4ffa1c0ad2787b7b48ecdeffd10c59f24804fa7a22d85f56cefa5 WatchSource:0}: Error finding container f85b337aa4d4ffa1c0ad2787b7b48ecdeffd10c59f24804fa7a22d85f56cefa5: Status 404 returned error can't find the container with id f85b337aa4d4ffa1c0ad2787b7b48ecdeffd10c59f24804fa7a22d85f56cefa5 Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.079414 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/01edcd7b-aec3-46e4-8638-545742dd0821-config-data-merged\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.079562 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-scripts\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.079601 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-config-data\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.079733 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-combined-ca-bundle\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.079755 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-amphora-certs\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.079769 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/01edcd7b-aec3-46e4-8638-545742dd0821-hm-ports\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.080421 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/01edcd7b-aec3-46e4-8638-545742dd0821-config-data-merged\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.081055 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/01edcd7b-aec3-46e4-8638-545742dd0821-hm-ports\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.086842 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-amphora-certs\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.087202 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-combined-ca-bundle\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.087816 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-scripts\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.088542 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01edcd7b-aec3-46e4-8638-545742dd0821-config-data\") pod \"octavia-worker-rwxpq\" (UID: \"01edcd7b-aec3-46e4-8638-545742dd0821\") " pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.130361 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-rd8d9" event={"ID":"0eff5089-5023-4c2f-930f-1bc252c879a9","Type":"ContainerStarted","Data":"f85b337aa4d4ffa1c0ad2787b7b48ecdeffd10c59f24804fa7a22d85f56cefa5"} Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.201997 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:35 crc kubenswrapper[5200]: I1126 15:09:35.777745 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-rwxpq"] Nov 26 15:09:35 crc kubenswrapper[5200]: W1126 15:09:35.786953 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01edcd7b_aec3_46e4_8638_545742dd0821.slice/crio-d2787952aa65acc236debf50ab03f860a2053e4b5b4566d9b96883136b193ca3 WatchSource:0}: Error finding container d2787952aa65acc236debf50ab03f860a2053e4b5b4566d9b96883136b193ca3: Status 404 returned error can't find the container with id d2787952aa65acc236debf50ab03f860a2053e4b5b4566d9b96883136b193ca3 Nov 26 15:09:36 crc kubenswrapper[5200]: I1126 15:09:36.144633 5200 generic.go:358] "Generic (PLEG): container finished" podID="43065d0b-e6ea-4c72-b512-58c58235ede5" containerID="318dbb4eed90c2b01737bd1b9fa689fcaa9a87415401165e949c4f830326bb70" exitCode=0 Nov 26 15:09:36 crc kubenswrapper[5200]: I1126 15:09:36.145249 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-nnl9w" event={"ID":"43065d0b-e6ea-4c72-b512-58c58235ede5","Type":"ContainerDied","Data":"318dbb4eed90c2b01737bd1b9fa689fcaa9a87415401165e949c4f830326bb70"} Nov 26 15:09:36 crc kubenswrapper[5200]: I1126 15:09:36.151922 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-rwxpq" event={"ID":"01edcd7b-aec3-46e4-8638-545742dd0821","Type":"ContainerStarted","Data":"d2787952aa65acc236debf50ab03f860a2053e4b5b4566d9b96883136b193ca3"} Nov 26 15:09:37 crc kubenswrapper[5200]: I1126 15:09:37.175267 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-nnl9w" event={"ID":"43065d0b-e6ea-4c72-b512-58c58235ede5","Type":"ContainerStarted","Data":"7169f3e718044fe47e149a54e2ffaa45c197bc7d8de8ef18c1042f1ad87482be"} Nov 26 15:09:37 crc kubenswrapper[5200]: I1126 15:09:37.176893 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:37 crc kubenswrapper[5200]: I1126 15:09:37.204123 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-nnl9w" podStartSLOduration=5.20410034 podStartE2EDuration="5.20410034s" podCreationTimestamp="2025-11-26 15:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:37.203042722 +0000 UTC m=+5945.882244550" watchObservedRunningTime="2025-11-26 15:09:37.20410034 +0000 UTC m=+5945.883302158" Nov 26 15:09:39 crc kubenswrapper[5200]: I1126 15:09:39.205113 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-rd8d9" event={"ID":"0eff5089-5023-4c2f-930f-1bc252c879a9","Type":"ContainerStarted","Data":"f6dbe61c6a3b3c90485fb3fdcf3440458d8aea571819a57f4cfcdf83e54f3069"} Nov 26 15:09:40 crc kubenswrapper[5200]: I1126 15:09:40.219731 5200 generic.go:358] "Generic (PLEG): container finished" podID="0eff5089-5023-4c2f-930f-1bc252c879a9" containerID="f6dbe61c6a3b3c90485fb3fdcf3440458d8aea571819a57f4cfcdf83e54f3069" exitCode=0 Nov 26 15:09:40 crc kubenswrapper[5200]: I1126 15:09:40.219884 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-rd8d9" 
event={"ID":"0eff5089-5023-4c2f-930f-1bc252c879a9","Type":"ContainerDied","Data":"f6dbe61c6a3b3c90485fb3fdcf3440458d8aea571819a57f4cfcdf83e54f3069"} Nov 26 15:09:40 crc kubenswrapper[5200]: I1126 15:09:40.223363 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-rwxpq" event={"ID":"01edcd7b-aec3-46e4-8638-545742dd0821","Type":"ContainerStarted","Data":"f4d7567e55186748b44f48236dd23081e68549f3b00ceefc9ec86a0f27285287"} Nov 26 15:09:41 crc kubenswrapper[5200]: I1126 15:09:41.235730 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-rd8d9" event={"ID":"0eff5089-5023-4c2f-930f-1bc252c879a9","Type":"ContainerStarted","Data":"2c6b3e7a3d57e7bab182914f8f6d8d1a9d0778e0564afff35c5fd6c5d4f95591"} Nov 26 15:09:41 crc kubenswrapper[5200]: I1126 15:09:41.236124 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:41 crc kubenswrapper[5200]: I1126 15:09:41.239152 5200 generic.go:358] "Generic (PLEG): container finished" podID="01edcd7b-aec3-46e4-8638-545742dd0821" containerID="f4d7567e55186748b44f48236dd23081e68549f3b00ceefc9ec86a0f27285287" exitCode=0 Nov 26 15:09:41 crc kubenswrapper[5200]: I1126 15:09:41.239294 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-rwxpq" event={"ID":"01edcd7b-aec3-46e4-8638-545742dd0821","Type":"ContainerDied","Data":"f4d7567e55186748b44f48236dd23081e68549f3b00ceefc9ec86a0f27285287"} Nov 26 15:09:41 crc kubenswrapper[5200]: I1126 15:09:41.272655 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-rd8d9" podStartSLOduration=4.657072175 podStartE2EDuration="7.272607293s" podCreationTimestamp="2025-11-26 15:09:34 +0000 UTC" firstStartedPulling="2025-11-26 15:09:35.029641254 +0000 UTC m=+5943.708843072" lastFinishedPulling="2025-11-26 15:09:37.645176372 +0000 UTC m=+5946.324378190" observedRunningTime="2025-11-26 15:09:41.257225974 +0000 UTC m=+5949.936427792" watchObservedRunningTime="2025-11-26 15:09:41.272607293 +0000 UTC m=+5949.951809141" Nov 26 15:09:42 crc kubenswrapper[5200]: I1126 15:09:42.257203 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-rwxpq" event={"ID":"01edcd7b-aec3-46e4-8638-545742dd0821","Type":"ContainerStarted","Data":"f2f637200fb2e129ebb4e667ee30eb19badc9a8faed29ebf73785d795cfa376f"} Nov 26 15:09:42 crc kubenswrapper[5200]: I1126 15:09:42.257801 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:42 crc kubenswrapper[5200]: I1126 15:09:42.280171 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-rwxpq" podStartSLOduration=5.356812899 podStartE2EDuration="8.280143109s" podCreationTimestamp="2025-11-26 15:09:34 +0000 UTC" firstStartedPulling="2025-11-26 15:09:35.78964444 +0000 UTC m=+5944.468846248" lastFinishedPulling="2025-11-26 15:09:38.71297462 +0000 UTC m=+5947.392176458" observedRunningTime="2025-11-26 15:09:42.275228908 +0000 UTC m=+5950.954430726" watchObservedRunningTime="2025-11-26 15:09:42.280143109 +0000 UTC m=+5950.959344927" Nov 26 15:09:42 crc kubenswrapper[5200]: E1126 15:09:42.543799 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2009507 actualBytes=10240 Nov 26 15:09:42 crc kubenswrapper[5200]: I1126 15:09:42.979228 5200 scope.go:117] "RemoveContainer" 
containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:09:42 crc kubenswrapper[5200]: E1126 15:09:42.979531 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:09:44 crc kubenswrapper[5200]: I1126 15:09:44.964214 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7svv7"] Nov 26 15:09:44 crc kubenswrapper[5200]: I1126 15:09:44.977390 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.003646 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7svv7"] Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.159183 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqlxz\" (UniqueName: \"kubernetes.io/projected/487bcfe5-a043-4cf8-a7d9-1490ae121148-kube-api-access-cqlxz\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.160478 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-catalog-content\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.160613 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-utilities\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.263382 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-catalog-content\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.263434 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-utilities\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.263628 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqlxz\" (UniqueName: \"kubernetes.io/projected/487bcfe5-a043-4cf8-a7d9-1490ae121148-kube-api-access-cqlxz\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 
15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.263963 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-catalog-content\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.264042 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-utilities\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.302923 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqlxz\" (UniqueName: \"kubernetes.io/projected/487bcfe5-a043-4cf8-a7d9-1490ae121148-kube-api-access-cqlxz\") pod \"community-operators-7svv7\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.313709 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:45 crc kubenswrapper[5200]: I1126 15:09:45.786309 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7svv7"] Nov 26 15:09:45 crc kubenswrapper[5200]: W1126 15:09:45.794076 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487bcfe5_a043_4cf8_a7d9_1490ae121148.slice/crio-f2bd1df99b31cdce6b75e9236f6a461b83706597f6c9c640b1a9069c05b6e200 WatchSource:0}: Error finding container f2bd1df99b31cdce6b75e9236f6a461b83706597f6c9c640b1a9069c05b6e200: Status 404 returned error can't find the container with id f2bd1df99b31cdce6b75e9236f6a461b83706597f6c9c640b1a9069c05b6e200 Nov 26 15:09:46 crc kubenswrapper[5200]: I1126 15:09:46.295913 5200 generic.go:358] "Generic (PLEG): container finished" podID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerID="893b8e4b44107b00eb1b9da82b94d9308f2b151c830699345b52bf50a0e6456a" exitCode=0 Nov 26 15:09:46 crc kubenswrapper[5200]: I1126 15:09:46.296111 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerDied","Data":"893b8e4b44107b00eb1b9da82b94d9308f2b151c830699345b52bf50a0e6456a"} Nov 26 15:09:46 crc kubenswrapper[5200]: I1126 15:09:46.296379 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerStarted","Data":"f2bd1df99b31cdce6b75e9236f6a461b83706597f6c9c640b1a9069c05b6e200"} Nov 26 15:09:48 crc kubenswrapper[5200]: I1126 15:09:48.317394 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerStarted","Data":"bbafab13bed42c22a0279322a582fc84f999c3b861830aff723737c281c2aab7"} Nov 26 15:09:50 crc kubenswrapper[5200]: I1126 15:09:50.344703 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" 
event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerDied","Data":"bbafab13bed42c22a0279322a582fc84f999c3b861830aff723737c281c2aab7"} Nov 26 15:09:50 crc kubenswrapper[5200]: I1126 15:09:50.344705 5200 generic.go:358] "Generic (PLEG): container finished" podID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerID="bbafab13bed42c22a0279322a582fc84f999c3b861830aff723737c281c2aab7" exitCode=0 Nov 26 15:09:51 crc kubenswrapper[5200]: I1126 15:09:51.357399 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerStarted","Data":"3e60d37e8e4e2bb5fdc26907039428d4fa5cc5b7b8b3192f35851f5a7542b2d4"} Nov 26 15:09:51 crc kubenswrapper[5200]: I1126 15:09:51.383884 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7svv7" podStartSLOduration=6.598414951 podStartE2EDuration="7.383864183s" podCreationTimestamp="2025-11-26 15:09:44 +0000 UTC" firstStartedPulling="2025-11-26 15:09:46.29779211 +0000 UTC m=+5954.976993928" lastFinishedPulling="2025-11-26 15:09:47.083241342 +0000 UTC m=+5955.762443160" observedRunningTime="2025-11-26 15:09:51.373539139 +0000 UTC m=+5960.052740977" watchObservedRunningTime="2025-11-26 15:09:51.383864183 +0000 UTC m=+5960.063066001" Nov 26 15:09:53 crc kubenswrapper[5200]: I1126 15:09:53.226326 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-nnl9w" Nov 26 15:09:55 crc kubenswrapper[5200]: I1126 15:09:55.314254 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:55 crc kubenswrapper[5200]: I1126 15:09:55.314613 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:55 crc kubenswrapper[5200]: I1126 15:09:55.367559 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:55 crc kubenswrapper[5200]: I1126 15:09:55.504638 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:55 crc kubenswrapper[5200]: I1126 15:09:55.626967 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7svv7"] Nov 26 15:09:56 crc kubenswrapper[5200]: I1126 15:09:56.970286 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:09:56 crc kubenswrapper[5200]: E1126 15:09:56.970906 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:09:57 crc kubenswrapper[5200]: I1126 15:09:57.295227 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-rd8d9" Nov 26 15:09:57 crc kubenswrapper[5200]: I1126 15:09:57.419206 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7svv7" 
podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="registry-server" containerID="cri-o://3e60d37e8e4e2bb5fdc26907039428d4fa5cc5b7b8b3192f35851f5a7542b2d4" gracePeriod=2 Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.301486 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-rwxpq" Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.435036 5200 generic.go:358] "Generic (PLEG): container finished" podID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerID="3e60d37e8e4e2bb5fdc26907039428d4fa5cc5b7b8b3192f35851f5a7542b2d4" exitCode=0 Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.436005 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerDied","Data":"3e60d37e8e4e2bb5fdc26907039428d4fa5cc5b7b8b3192f35851f5a7542b2d4"} Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.436045 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7svv7" event={"ID":"487bcfe5-a043-4cf8-a7d9-1490ae121148","Type":"ContainerDied","Data":"f2bd1df99b31cdce6b75e9236f6a461b83706597f6c9c640b1a9069c05b6e200"} Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.436056 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2bd1df99b31cdce6b75e9236f6a461b83706597f6c9c640b1a9069c05b6e200" Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.440225 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.583419 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-utilities\") pod \"487bcfe5-a043-4cf8-a7d9-1490ae121148\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.584172 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqlxz\" (UniqueName: \"kubernetes.io/projected/487bcfe5-a043-4cf8-a7d9-1490ae121148-kube-api-access-cqlxz\") pod \"487bcfe5-a043-4cf8-a7d9-1490ae121148\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.584368 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-catalog-content\") pod \"487bcfe5-a043-4cf8-a7d9-1490ae121148\" (UID: \"487bcfe5-a043-4cf8-a7d9-1490ae121148\") " Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.584830 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-utilities" (OuterVolumeSpecName: "utilities") pod "487bcfe5-a043-4cf8-a7d9-1490ae121148" (UID: "487bcfe5-a043-4cf8-a7d9-1490ae121148"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.585754 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.593391 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487bcfe5-a043-4cf8-a7d9-1490ae121148-kube-api-access-cqlxz" (OuterVolumeSpecName: "kube-api-access-cqlxz") pod "487bcfe5-a043-4cf8-a7d9-1490ae121148" (UID: "487bcfe5-a043-4cf8-a7d9-1490ae121148"). InnerVolumeSpecName "kube-api-access-cqlxz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:09:58 crc kubenswrapper[5200]: I1126 15:09:58.688350 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqlxz\" (UniqueName: \"kubernetes.io/projected/487bcfe5-a043-4cf8-a7d9-1490ae121148-kube-api-access-cqlxz\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:59 crc kubenswrapper[5200]: I1126 15:09:59.100445 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "487bcfe5-a043-4cf8-a7d9-1490ae121148" (UID: "487bcfe5-a043-4cf8-a7d9-1490ae121148"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:09:59 crc kubenswrapper[5200]: I1126 15:09:59.201417 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487bcfe5-a043-4cf8-a7d9-1490ae121148-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:59 crc kubenswrapper[5200]: I1126 15:09:59.447389 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7svv7" Nov 26 15:09:59 crc kubenswrapper[5200]: I1126 15:09:59.495238 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7svv7"] Nov 26 15:09:59 crc kubenswrapper[5200]: I1126 15:09:59.508885 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7svv7"] Nov 26 15:10:00 crc kubenswrapper[5200]: I1126 15:10:00.981377 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" path="/var/lib/kubelet/pods/487bcfe5-a043-4cf8-a7d9-1490ae121148/volumes" Nov 26 15:10:08 crc kubenswrapper[5200]: I1126 15:10:08.970109 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:10:08 crc kubenswrapper[5200]: E1126 15:10:08.971305 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:10:11 crc kubenswrapper[5200]: I1126 15:10:11.046961 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-l4zh7"] Nov 26 15:10:11 crc kubenswrapper[5200]: I1126 15:10:11.064670 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-af71-account-create-update-7rx84"] Nov 26 15:10:11 crc kubenswrapper[5200]: I1126 15:10:11.074792 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-l4zh7"] Nov 26 15:10:11 crc kubenswrapper[5200]: I1126 15:10:11.085480 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-af71-account-create-update-7rx84"] Nov 26 15:10:12 crc kubenswrapper[5200]: I1126 15:10:12.990651 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d16627b-f04e-40f8-97fd-66dfef695d5c" path="/var/lib/kubelet/pods/0d16627b-f04e-40f8-97fd-66dfef695d5c/volumes" Nov 26 15:10:12 crc kubenswrapper[5200]: I1126 15:10:12.991551 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e538b2-ca3f-48bf-85ad-abf37d2cf9ab" path="/var/lib/kubelet/pods/24e538b2-ca3f-48bf-85ad-abf37d2cf9ab/volumes" Nov 26 15:10:17 crc kubenswrapper[5200]: I1126 15:10:17.043352 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9q58q"] Nov 26 15:10:17 crc kubenswrapper[5200]: I1126 15:10:17.057909 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9q58q"] Nov 26 15:10:18 crc kubenswrapper[5200]: I1126 15:10:18.983859 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3159e145-f5bf-41a3-aa10-87c4725e3781" path="/var/lib/kubelet/pods/3159e145-f5bf-41a3-aa10-87c4725e3781/volumes" Nov 26 15:10:22 crc kubenswrapper[5200]: I1126 15:10:22.981035 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:10:22 crc kubenswrapper[5200]: E1126 15:10:22.981988 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:10:29 crc kubenswrapper[5200]: I1126 15:10:29.319714 5200 scope.go:117] "RemoveContainer" containerID="c5ae197d5ceda2edc06e34cd13d2bb3b53ee908f900c3e2ad32e73f850242229" Nov 26 15:10:29 crc kubenswrapper[5200]: I1126 15:10:29.382891 5200 scope.go:117] "RemoveContainer" containerID="d0eafa9dde6fa57957feab076a6f2acfe739b429e4158b21749fd8a32a3f64e3" Nov 26 15:10:29 crc kubenswrapper[5200]: I1126 15:10:29.421961 5200 scope.go:117] "RemoveContainer" containerID="af1c3d825cf6c6bf07f06626bd0e2c0992c9015d47cb86a3a060d24b55fac3a6" Nov 26 15:10:29 crc kubenswrapper[5200]: I1126 15:10:29.469789 5200 scope.go:117] "RemoveContainer" containerID="791821d66e1f8e8b70c4935a2399a576d89835d7b09ab5e09ba7317f1aec41d1" Nov 26 15:10:33 crc kubenswrapper[5200]: I1126 15:10:33.974616 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:10:33 crc kubenswrapper[5200]: E1126 15:10:33.976113 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:10:37 crc kubenswrapper[5200]: I1126 15:10:37.547112 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:10:37 crc kubenswrapper[5200]: I1126 15:10:37.547174 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:10:37 crc kubenswrapper[5200]: I1126 15:10:37.553404 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:10:37 crc kubenswrapper[5200]: I1126 15:10:37.553441 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:10:42 crc kubenswrapper[5200]: E1126 15:10:42.433129 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2009478 actualBytes=10240 Nov 26 15:10:43 crc kubenswrapper[5200]: I1126 15:10:43.065347 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tnfxv"] Nov 26 15:10:43 crc kubenswrapper[5200]: I1126 15:10:43.081204 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tnfxv"] Nov 26 15:10:44 crc kubenswrapper[5200]: I1126 15:10:44.056720 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2f2d-account-create-update-rdf8p"] Nov 26 15:10:44 crc kubenswrapper[5200]: I1126 15:10:44.069567 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2f2d-account-create-update-rdf8p"] Nov 26 15:10:44 crc kubenswrapper[5200]: I1126 15:10:44.979901 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="28b41a86-1404-4554-9b24-7c8eaa6f0ac9" path="/var/lib/kubelet/pods/28b41a86-1404-4554-9b24-7c8eaa6f0ac9/volumes" Nov 26 15:10:45 crc kubenswrapper[5200]: I1126 15:10:45.100289 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b22c5306-1d48-4995-83ca-b6fc66bd3b1f" path="/var/lib/kubelet/pods/b22c5306-1d48-4995-83ca-b6fc66bd3b1f/volumes" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.520241 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-86dccf9c9c-th5p7"] Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.524856 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="extract-utilities" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.524882 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="extract-utilities" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.524910 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="extract-content" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.524916 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="extract-content" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.524933 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="registry-server" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.524938 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="registry-server" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.525142 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="487bcfe5-a043-4cf8-a7d9-1490ae121148" containerName="registry-server" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.655492 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86dccf9c9c-th5p7"] Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.655528 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.655550 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-547f477b6f-tzg2z"] Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.655904 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-log" containerID="cri-o://445e7507e24f17f34ab09694d26415b94e9f7462eb438ca397d5eb12945390b2" gracePeriod=30 Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.655934 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.655954 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-httpd" containerID="cri-o://e9c3c7d7e22346be3b1b68caacb77d4a03db0afbbbec4b9af4454e54d843d71a" gracePeriod=30 Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.664440 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"horizon-config-data\"" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.664662 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"horizon\"" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.664832 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"horizon-scripts\"" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.666272 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"horizon-horizon-dockercfg-ghmst\"" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.710013 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.710053 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-547f477b6f-tzg2z"] Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.710282 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-log" containerID="cri-o://754d6612e9d989ffc241aa4b07505878a90e8f5e11bd3b22482da80f4bff2ca9" gracePeriod=30 Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.710318 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.710673 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-httpd" containerID="cri-o://dd9951a3e60274cc10eac62488feca37bbf6967de71549cba94f5e87bc4c60d0" gracePeriod=30 Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.729112 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-scripts\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.729177 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-scripts\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.729274 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91872da6-36e7-41f3-946e-160f8a255e28-horizon-secret-key\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.729403 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-config-data\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.729601 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcgxd\" (UniqueName: \"kubernetes.io/projected/29864ea7-0c2b-46ae-aa60-93b6a8318380-kube-api-access-fcgxd\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.730311 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29864ea7-0c2b-46ae-aa60-93b6a8318380-horizon-secret-key\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.730395 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-config-data\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.730475 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp6nz\" (UniqueName: \"kubernetes.io/projected/91872da6-36e7-41f3-946e-160f8a255e28-kube-api-access-wp6nz\") pod \"horizon-547f477b6f-tzg2z\" (UID: 
\"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.730518 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91872da6-36e7-41f3-946e-160f8a255e28-logs\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.730605 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29864ea7-0c2b-46ae-aa60-93b6a8318380-logs\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832315 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcgxd\" (UniqueName: \"kubernetes.io/projected/29864ea7-0c2b-46ae-aa60-93b6a8318380-kube-api-access-fcgxd\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832675 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29864ea7-0c2b-46ae-aa60-93b6a8318380-horizon-secret-key\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832736 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-config-data\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832785 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp6nz\" (UniqueName: \"kubernetes.io/projected/91872da6-36e7-41f3-946e-160f8a255e28-kube-api-access-wp6nz\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832819 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91872da6-36e7-41f3-946e-160f8a255e28-logs\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832866 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29864ea7-0c2b-46ae-aa60-93b6a8318380-logs\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832906 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-scripts\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832957 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-scripts\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.832992 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91872da6-36e7-41f3-946e-160f8a255e28-horizon-secret-key\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.833060 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-config-data\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.834129 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-scripts\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.834487 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91872da6-36e7-41f3-946e-160f8a255e28-logs\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.834641 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-config-data\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.834659 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-scripts\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.835213 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-config-data\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.835375 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29864ea7-0c2b-46ae-aa60-93b6a8318380-logs\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.840988 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91872da6-36e7-41f3-946e-160f8a255e28-horizon-secret-key\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " 
pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.840976 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29864ea7-0c2b-46ae-aa60-93b6a8318380-horizon-secret-key\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.852515 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcgxd\" (UniqueName: \"kubernetes.io/projected/29864ea7-0c2b-46ae-aa60-93b6a8318380-kube-api-access-fcgxd\") pod \"horizon-86dccf9c9c-th5p7\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.854409 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp6nz\" (UniqueName: \"kubernetes.io/projected/91872da6-36e7-41f3-946e-160f8a255e28-kube-api-access-wp6nz\") pod \"horizon-547f477b6f-tzg2z\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.969679 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:10:47 crc kubenswrapper[5200]: E1126 15:10:47.970171 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:10:47 crc kubenswrapper[5200]: I1126 15:10:47.986519 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.040813 5200 generic.go:358] "Generic (PLEG): container finished" podID="3726df62-5cbd-44e0-af53-6097e8a99089" containerID="445e7507e24f17f34ab09694d26415b94e9f7462eb438ca397d5eb12945390b2" exitCode=143 Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.041418 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3726df62-5cbd-44e0-af53-6097e8a99089","Type":"ContainerDied","Data":"445e7507e24f17f34ab09694d26415b94e9f7462eb438ca397d5eb12945390b2"} Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.045195 5200 generic.go:358] "Generic (PLEG): container finished" podID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerID="754d6612e9d989ffc241aa4b07505878a90e8f5e11bd3b22482da80f4bff2ca9" exitCode=143 Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.045379 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a9ab2ad-8c80-4821-8223-346343d1c4a3","Type":"ContainerDied","Data":"754d6612e9d989ffc241aa4b07505878a90e8f5e11bd3b22482da80f4bff2ca9"} Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.049086 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.234091 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547f477b6f-tzg2z"] Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.269318 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fd9587865-d7gsp"] Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.335804 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fd9587865-d7gsp"] Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.335941 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.451288 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0a1542c-d2dd-4963-9394-58e52fe057ff-horizon-secret-key\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.451384 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28vvd\" (UniqueName: \"kubernetes.io/projected/c0a1542c-d2dd-4963-9394-58e52fe057ff-kube-api-access-28vvd\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.451424 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a1542c-d2dd-4963-9394-58e52fe057ff-logs\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.451549 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-config-data\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.451620 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-scripts\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.547337 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86dccf9c9c-th5p7"] Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.551288 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.554181 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-scripts\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.554922 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0a1542c-d2dd-4963-9394-58e52fe057ff-horizon-secret-key\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.555031 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-scripts\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.555038 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28vvd\" (UniqueName: \"kubernetes.io/projected/c0a1542c-d2dd-4963-9394-58e52fe057ff-kube-api-access-28vvd\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.555246 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a1542c-d2dd-4963-9394-58e52fe057ff-logs\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.555575 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-config-data\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.555793 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a1542c-d2dd-4963-9394-58e52fe057ff-logs\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.556743 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-config-data\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.560967 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0a1542c-d2dd-4963-9394-58e52fe057ff-horizon-secret-key\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.571411 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28vvd\" (UniqueName: \"kubernetes.io/projected/c0a1542c-d2dd-4963-9394-58e52fe057ff-kube-api-access-28vvd\") pod \"horizon-6fd9587865-d7gsp\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.676716 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547f477b6f-tzg2z"] Nov 26 15:10:48 crc kubenswrapper[5200]: I1126 15:10:48.678142 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:49 crc kubenswrapper[5200]: I1126 15:10:49.071232 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dccf9c9c-th5p7" event={"ID":"29864ea7-0c2b-46ae-aa60-93b6a8318380","Type":"ContainerStarted","Data":"9df56eb9f0ae3afb1d007dbc3a3001ab88093aba19f8d3a3be8db4652eb60a19"} Nov 26 15:10:49 crc kubenswrapper[5200]: I1126 15:10:49.077419 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f477b6f-tzg2z" event={"ID":"91872da6-36e7-41f3-946e-160f8a255e28","Type":"ContainerStarted","Data":"bf5cdf9e6a99860503ac34ec0613fa8b76cc4f3a80a636a52162fedc80568f39"} Nov 26 15:10:49 crc kubenswrapper[5200]: I1126 15:10:49.219871 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fd9587865-d7gsp"] Nov 26 15:10:50 crc kubenswrapper[5200]: I1126 15:10:50.090572 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fd9587865-d7gsp" event={"ID":"c0a1542c-d2dd-4963-9394-58e52fe057ff","Type":"ContainerStarted","Data":"526159dbe93e58255bb1951d1631c9c732cf66b763db0f3b0e6bc439ef62161d"} Nov 26 15:10:51 crc kubenswrapper[5200]: I1126 15:10:51.102989 5200 generic.go:358] "Generic (PLEG): container finished" podID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerID="dd9951a3e60274cc10eac62488feca37bbf6967de71549cba94f5e87bc4c60d0" exitCode=0 Nov 26 15:10:51 crc kubenswrapper[5200]: I1126 15:10:51.103079 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a9ab2ad-8c80-4821-8223-346343d1c4a3","Type":"ContainerDied","Data":"dd9951a3e60274cc10eac62488feca37bbf6967de71549cba94f5e87bc4c60d0"} Nov 26 15:10:51 crc kubenswrapper[5200]: I1126 15:10:51.106307 5200 generic.go:358] "Generic (PLEG): container finished" podID="3726df62-5cbd-44e0-af53-6097e8a99089" containerID="e9c3c7d7e22346be3b1b68caacb77d4a03db0afbbbec4b9af4454e54d843d71a" exitCode=0 Nov 26 15:10:51 crc kubenswrapper[5200]: I1126 15:10:51.106391 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3726df62-5cbd-44e0-af53-6097e8a99089","Type":"ContainerDied","Data":"e9c3c7d7e22346be3b1b68caacb77d4a03db0afbbbec4b9af4454e54d843d71a"} Nov 26 15:10:52 crc kubenswrapper[5200]: I1126 15:10:52.059623 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wm5bt"] Nov 26 15:10:52 crc kubenswrapper[5200]: I1126 15:10:52.070887 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wm5bt"] Nov 26 15:10:52 crc kubenswrapper[5200]: I1126 15:10:52.983022 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ab3495-0e54-4181-a051-e292abef15a9" path="/var/lib/kubelet/pods/64ab3495-0e54-4181-a051-e292abef15a9/volumes" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.687528 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.692985 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858144 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-config-data\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858217 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87v8r\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-kube-api-access-87v8r\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858251 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-combined-ca-bundle\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858339 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-ceph\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858419 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-logs\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858584 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-httpd-run\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858674 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-scripts\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858810 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-httpd-run\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858884 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-combined-ca-bundle\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.858973 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-scripts\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc 
kubenswrapper[5200]: I1126 15:10:56.859078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-logs\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.859898 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lmj8\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-kube-api-access-7lmj8\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.859982 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-ceph\") pod \"3726df62-5cbd-44e0-af53-6097e8a99089\" (UID: \"3726df62-5cbd-44e0-af53-6097e8a99089\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.859981 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-logs" (OuterVolumeSpecName: "logs") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.860135 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-config-data\") pod \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\" (UID: \"0a9ab2ad-8c80-4821-8223-346343d1c4a3\") " Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.860202 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.861495 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.862399 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-logs" (OuterVolumeSpecName: "logs") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.862917 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.862935 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.862947 5200 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3726df62-5cbd-44e0-af53-6097e8a99089-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.862956 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a9ab2ad-8c80-4821-8223-346343d1c4a3-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.868397 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-ceph" (OuterVolumeSpecName: "ceph") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.868673 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-kube-api-access-87v8r" (OuterVolumeSpecName: "kube-api-access-87v8r") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "kube-api-access-87v8r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.868892 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-scripts" (OuterVolumeSpecName: "scripts") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.870956 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-scripts" (OuterVolumeSpecName: "scripts") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.877839 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-ceph" (OuterVolumeSpecName: "ceph") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.883150 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-kube-api-access-7lmj8" (OuterVolumeSpecName: "kube-api-access-7lmj8") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). 
InnerVolumeSpecName "kube-api-access-7lmj8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.904336 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.915969 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.949770 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-config-data" (OuterVolumeSpecName: "config-data") pod "3726df62-5cbd-44e0-af53-6097e8a99089" (UID: "3726df62-5cbd-44e0-af53-6097e8a99089"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.952867 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-config-data" (OuterVolumeSpecName: "config-data") pod "0a9ab2ad-8c80-4821-8223-346343d1c4a3" (UID: "0a9ab2ad-8c80-4821-8223-346343d1c4a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.968750 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.968961 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969000 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lmj8\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-kube-api-access-7lmj8\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969016 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3726df62-5cbd-44e0-af53-6097e8a99089-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969062 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ab2ad-8c80-4821-8223-346343d1c4a3-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969078 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969110 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87v8r\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-kube-api-access-87v8r\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969139 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969153 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a9ab2ad-8c80-4821-8223-346343d1c4a3-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:56 crc kubenswrapper[5200]: I1126 15:10:56.969164 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3726df62-5cbd-44e0-af53-6097e8a99089-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.183503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a9ab2ad-8c80-4821-8223-346343d1c4a3","Type":"ContainerDied","Data":"f26e2a8d8c379cd862663920769bf2c2efd554b6e0fa634a59688bf738e16629"} Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.183570 5200 scope.go:117] "RemoveContainer" containerID="dd9951a3e60274cc10eac62488feca37bbf6967de71549cba94f5e87bc4c60d0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.183515 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.186025 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3726df62-5cbd-44e0-af53-6097e8a99089","Type":"ContainerDied","Data":"bbdf79a96d18c1a1a0cc50d41d2748a803f9ad64dfbbf23fcc76707fe533e781"} Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.186109 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.262842 5200 scope.go:117] "RemoveContainer" containerID="754d6612e9d989ffc241aa4b07505878a90e8f5e11bd3b22482da80f4bff2ca9" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.263369 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.305581 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.324132 5200 scope.go:117] "RemoveContainer" containerID="e9c3c7d7e22346be3b1b68caacb77d4a03db0afbbbec4b9af4454e54d843d71a" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.332374 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.359495 5200 scope.go:117] "RemoveContainer" containerID="445e7507e24f17f34ab09694d26415b94e9f7462eb438ca397d5eb12945390b2" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.361840 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363261 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-httpd" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363286 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-httpd" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363307 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-httpd" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363312 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-httpd" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363341 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-log" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363346 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-log" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363358 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-log" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363363 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-log" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363541 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" 
containerName="glance-httpd" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363552 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-log" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363569 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" containerName="glance-httpd" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.363580 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" containerName="glance-log" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.374123 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.377078 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-qwlg9\"" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.377521 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.377609 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.377659 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\"" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.383982 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.384106 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa79542b-64a8-44dc-b8fa-131c110cf1ab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.384141 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.384216 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa79542b-64a8-44dc-b8fa-131c110cf1ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.384258 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz65t\" (UniqueName: \"kubernetes.io/projected/aa79542b-64a8-44dc-b8fa-131c110cf1ab-kube-api-access-gz65t\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " 
pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.384318 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.384390 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa79542b-64a8-44dc-b8fa-131c110cf1ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.399957 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.417195 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.426253 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.428039 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.428909 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.487841 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa79542b-64a8-44dc-b8fa-131c110cf1ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.488265 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.488399 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200f8e7-0290-47f8-8d77-769ebaa23422-logs\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.489189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa79542b-64a8-44dc-b8fa-131c110cf1ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.489366 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.489510 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-config-data\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.489621 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7qv\" (UniqueName: \"kubernetes.io/projected/a200f8e7-0290-47f8-8d77-769ebaa23422-kube-api-access-2p7qv\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.489779 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa79542b-64a8-44dc-b8fa-131c110cf1ab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.489914 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.490008 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a200f8e7-0290-47f8-8d77-769ebaa23422-ceph\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.490807 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa79542b-64a8-44dc-b8fa-131c110cf1ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.490995 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz65t\" (UniqueName: \"kubernetes.io/projected/aa79542b-64a8-44dc-b8fa-131c110cf1ab-kube-api-access-gz65t\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.491149 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.491357 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a200f8e7-0290-47f8-8d77-769ebaa23422-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.491440 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.492240 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa79542b-64a8-44dc-b8fa-131c110cf1ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.493564 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.495566 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.503419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa79542b-64a8-44dc-b8fa-131c110cf1ab-ceph\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.507828 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa79542b-64a8-44dc-b8fa-131c110cf1ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.509423 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz65t\" (UniqueName: \"kubernetes.io/projected/aa79542b-64a8-44dc-b8fa-131c110cf1ab-kube-api-access-gz65t\") pod \"glance-default-internal-api-0\" (UID: \"aa79542b-64a8-44dc-b8fa-131c110cf1ab\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.594127 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.594182 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a200f8e7-0290-47f8-8d77-769ebaa23422-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc 
kubenswrapper[5200]: I1126 15:10:57.594273 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200f8e7-0290-47f8-8d77-769ebaa23422-logs\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.594307 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-scripts\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.594335 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-config-data\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.594362 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7qv\" (UniqueName: \"kubernetes.io/projected/a200f8e7-0290-47f8-8d77-769ebaa23422-kube-api-access-2p7qv\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.594398 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a200f8e7-0290-47f8-8d77-769ebaa23422-ceph\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.595589 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a200f8e7-0290-47f8-8d77-769ebaa23422-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.595604 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a200f8e7-0290-47f8-8d77-769ebaa23422-logs\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.600896 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a200f8e7-0290-47f8-8d77-769ebaa23422-ceph\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.601354 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.602145 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-config-data\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.602196 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a200f8e7-0290-47f8-8d77-769ebaa23422-scripts\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.613033 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7qv\" (UniqueName: \"kubernetes.io/projected/a200f8e7-0290-47f8-8d77-769ebaa23422-kube-api-access-2p7qv\") pod \"glance-default-external-api-0\" (UID: \"a200f8e7-0290-47f8-8d77-769ebaa23422\") " pod="openstack/glance-default-external-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.712318 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:10:57 crc kubenswrapper[5200]: I1126 15:10:57.762509 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.204705 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dccf9c9c-th5p7" event={"ID":"29864ea7-0c2b-46ae-aa60-93b6a8318380","Type":"ContainerStarted","Data":"227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087"} Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.205139 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dccf9c9c-th5p7" event={"ID":"29864ea7-0c2b-46ae-aa60-93b6a8318380","Type":"ContainerStarted","Data":"4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e"} Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.209200 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fd9587865-d7gsp" event={"ID":"c0a1542c-d2dd-4963-9394-58e52fe057ff","Type":"ContainerStarted","Data":"16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89"} Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.209253 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fd9587865-d7gsp" event={"ID":"c0a1542c-d2dd-4963-9394-58e52fe057ff","Type":"ContainerStarted","Data":"3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9"} Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.214921 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f477b6f-tzg2z" event={"ID":"91872da6-36e7-41f3-946e-160f8a255e28","Type":"ContainerStarted","Data":"10f41548671855c11947a05f4a9a9bcae419325fd607745d8080478b44dd5c00"} Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.215229 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f477b6f-tzg2z" event={"ID":"91872da6-36e7-41f3-946e-160f8a255e28","Type":"ContainerStarted","Data":"34abfa6d0570f774fe8e957708178d31f9a5b609d93d963d68bd86470dd6b05a"} Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.215043 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-547f477b6f-tzg2z" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon" containerID="cri-o://10f41548671855c11947a05f4a9a9bcae419325fd607745d8080478b44dd5c00" 
gracePeriod=30 Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.215023 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-547f477b6f-tzg2z" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon-log" containerID="cri-o://34abfa6d0570f774fe8e957708178d31f9a5b609d93d963d68bd86470dd6b05a" gracePeriod=30 Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.240882 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86dccf9c9c-th5p7" podStartSLOduration=2.7975929219999998 podStartE2EDuration="11.240849626s" podCreationTimestamp="2025-11-26 15:10:47 +0000 UTC" firstStartedPulling="2025-11-26 15:10:48.555866744 +0000 UTC m=+6017.235068562" lastFinishedPulling="2025-11-26 15:10:56.999123448 +0000 UTC m=+6025.678325266" observedRunningTime="2025-11-26 15:10:58.230935893 +0000 UTC m=+6026.910137711" watchObservedRunningTime="2025-11-26 15:10:58.240849626 +0000 UTC m=+6026.920051454" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.272440 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-547f477b6f-tzg2z" podStartSLOduration=2.952756416 podStartE2EDuration="11.272406815s" podCreationTimestamp="2025-11-26 15:10:47 +0000 UTC" firstStartedPulling="2025-11-26 15:10:48.704620897 +0000 UTC m=+6017.383822715" lastFinishedPulling="2025-11-26 15:10:57.024271296 +0000 UTC m=+6025.703473114" observedRunningTime="2025-11-26 15:10:58.255124756 +0000 UTC m=+6026.934326594" watchObservedRunningTime="2025-11-26 15:10:58.272406815 +0000 UTC m=+6026.951608633" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.291835 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fd9587865-d7gsp" podStartSLOduration=2.425055618 podStartE2EDuration="10.291810211s" podCreationTimestamp="2025-11-26 15:10:48 +0000 UTC" firstStartedPulling="2025-11-26 15:10:49.226723932 +0000 UTC m=+6017.905925750" lastFinishedPulling="2025-11-26 15:10:57.093478525 +0000 UTC m=+6025.772680343" observedRunningTime="2025-11-26 15:10:58.277579892 +0000 UTC m=+6026.956781720" watchObservedRunningTime="2025-11-26 15:10:58.291810211 +0000 UTC m=+6026.971012029" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.361210 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:10:58 crc kubenswrapper[5200]: W1126 15:10:58.370197 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa79542b_64a8_44dc_b8fa_131c110cf1ab.slice/crio-dda52cb2956d95e02874d21fbbd80329ef273a1081e176ab8829db9deae3172d WatchSource:0}: Error finding container dda52cb2956d95e02874d21fbbd80329ef273a1081e176ab8829db9deae3172d: Status 404 returned error can't find the container with id dda52cb2956d95e02874d21fbbd80329ef273a1081e176ab8829db9deae3172d Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.448660 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.679459 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.679516 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.971862 5200 
scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:10:58 crc kubenswrapper[5200]: E1126 15:10:58.972490 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.987042 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9ab2ad-8c80-4821-8223-346343d1c4a3" path="/var/lib/kubelet/pods/0a9ab2ad-8c80-4821-8223-346343d1c4a3/volumes" Nov 26 15:10:58 crc kubenswrapper[5200]: I1126 15:10:58.995340 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3726df62-5cbd-44e0-af53-6097e8a99089" path="/var/lib/kubelet/pods/3726df62-5cbd-44e0-af53-6097e8a99089/volumes" Nov 26 15:10:59 crc kubenswrapper[5200]: I1126 15:10:59.231330 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa79542b-64a8-44dc-b8fa-131c110cf1ab","Type":"ContainerStarted","Data":"dda52cb2956d95e02874d21fbbd80329ef273a1081e176ab8829db9deae3172d"} Nov 26 15:10:59 crc kubenswrapper[5200]: I1126 15:10:59.236398 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a200f8e7-0290-47f8-8d77-769ebaa23422","Type":"ContainerStarted","Data":"7234699fb6c6f2005b0c3d05c1434232bf51a139bbed75b4889b477f54856365"} Nov 26 15:11:00 crc kubenswrapper[5200]: I1126 15:11:00.249513 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa79542b-64a8-44dc-b8fa-131c110cf1ab","Type":"ContainerStarted","Data":"23132f556491b0863ac414ba657e8e7f657be92b2e254abebb7caf9bf79b3c1f"} Nov 26 15:11:00 crc kubenswrapper[5200]: I1126 15:11:00.249952 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa79542b-64a8-44dc-b8fa-131c110cf1ab","Type":"ContainerStarted","Data":"8ca74d93a2b9e413f7a51cbe96092a9ad9daae15911e9136359d848ef93a1a10"} Nov 26 15:11:00 crc kubenswrapper[5200]: I1126 15:11:00.253672 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a200f8e7-0290-47f8-8d77-769ebaa23422","Type":"ContainerStarted","Data":"27f929dc7dabc4cf4f68d9df2db8e7c0eb0d11446fafc7773be99266a50666a2"} Nov 26 15:11:01 crc kubenswrapper[5200]: I1126 15:11:01.269536 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a200f8e7-0290-47f8-8d77-769ebaa23422","Type":"ContainerStarted","Data":"535fc2abebc7ec19053772b6d6914a8ee00809eacefd6e109416d0b2a20d68a4"} Nov 26 15:11:01 crc kubenswrapper[5200]: I1126 15:11:01.306590 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.30657032 podStartE2EDuration="4.30657032s" podCreationTimestamp="2025-11-26 15:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:11:01.298404793 +0000 UTC m=+6029.977606611" watchObservedRunningTime="2025-11-26 15:11:01.30657032 +0000 UTC 
m=+6029.985772148" Nov 26 15:11:01 crc kubenswrapper[5200]: I1126 15:11:01.328074 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.328051541 podStartE2EDuration="4.328051541s" podCreationTimestamp="2025-11-26 15:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:11:01.321401814 +0000 UTC m=+6030.000603632" watchObservedRunningTime="2025-11-26 15:11:01.328051541 +0000 UTC m=+6030.007253359" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.715349 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.715960 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.749834 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.761765 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.764089 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.764137 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.810562 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.811174 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.987614 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.987681 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:11:07 crc kubenswrapper[5200]: I1126 15:11:07.991669 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-86dccf9c9c-th5p7" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 26 15:11:08 crc kubenswrapper[5200]: I1126 15:11:08.049668 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:11:08 crc kubenswrapper[5200]: I1126 15:11:08.344540 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 15:11:08 crc kubenswrapper[5200]: I1126 15:11:08.344577 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Nov 26 15:11:08 crc kubenswrapper[5200]: I1126 15:11:08.344586 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openstack/glance-default-internal-api-0" Nov 26 15:11:08 crc kubenswrapper[5200]: I1126 15:11:08.344594 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:08 crc kubenswrapper[5200]: I1126 15:11:08.682643 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 26 15:11:09 crc kubenswrapper[5200]: I1126 15:11:09.970564 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:11:09 crc kubenswrapper[5200]: E1126 15:11:09.971738 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:11:10 crc kubenswrapper[5200]: I1126 15:11:10.995483 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:10 crc kubenswrapper[5200]: I1126 15:11:10.996962 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:11:11 crc kubenswrapper[5200]: I1126 15:11:11.061519 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:11:11 crc kubenswrapper[5200]: I1126 15:11:11.116281 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:11:11 crc kubenswrapper[5200]: I1126 15:11:11.116733 5200 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:11:11 crc kubenswrapper[5200]: I1126 15:11:11.147658 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:11:18 crc kubenswrapper[5200]: I1126 15:11:18.680280 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 26 15:11:20 crc kubenswrapper[5200]: I1126 15:11:20.136577 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:11:21 crc kubenswrapper[5200]: I1126 15:11:21.657822 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:11:23 crc kubenswrapper[5200]: I1126 15:11:23.970064 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:11:23 crc kubenswrapper[5200]: E1126 15:11:23.971222 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:11:28 crc kubenswrapper[5200]: I1126 15:11:28.586993 5200 generic.go:358] "Generic (PLEG): container finished" podID="91872da6-36e7-41f3-946e-160f8a255e28" containerID="10f41548671855c11947a05f4a9a9bcae419325fd607745d8080478b44dd5c00" exitCode=137 Nov 26 15:11:28 crc kubenswrapper[5200]: I1126 15:11:28.587572 5200 generic.go:358] "Generic (PLEG): container finished" podID="91872da6-36e7-41f3-946e-160f8a255e28" containerID="34abfa6d0570f774fe8e957708178d31f9a5b609d93d963d68bd86470dd6b05a" exitCode=137 Nov 26 15:11:28 crc kubenswrapper[5200]: I1126 15:11:28.587179 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f477b6f-tzg2z" event={"ID":"91872da6-36e7-41f3-946e-160f8a255e28","Type":"ContainerDied","Data":"10f41548671855c11947a05f4a9a9bcae419325fd607745d8080478b44dd5c00"} Nov 26 15:11:28 crc kubenswrapper[5200]: I1126 15:11:28.587820 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f477b6f-tzg2z" event={"ID":"91872da6-36e7-41f3-946e-160f8a255e28","Type":"ContainerDied","Data":"34abfa6d0570f774fe8e957708178d31f9a5b609d93d963d68bd86470dd6b05a"} Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.364089 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.450596 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp6nz\" (UniqueName: \"kubernetes.io/projected/91872da6-36e7-41f3-946e-160f8a255e28-kube-api-access-wp6nz\") pod \"91872da6-36e7-41f3-946e-160f8a255e28\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.450826 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91872da6-36e7-41f3-946e-160f8a255e28-horizon-secret-key\") pod \"91872da6-36e7-41f3-946e-160f8a255e28\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.450998 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-config-data\") pod \"91872da6-36e7-41f3-946e-160f8a255e28\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.451110 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91872da6-36e7-41f3-946e-160f8a255e28-logs\") pod \"91872da6-36e7-41f3-946e-160f8a255e28\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.451211 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-scripts\") pod \"91872da6-36e7-41f3-946e-160f8a255e28\" (UID: \"91872da6-36e7-41f3-946e-160f8a255e28\") " Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.453336 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91872da6-36e7-41f3-946e-160f8a255e28-logs" (OuterVolumeSpecName: "logs") pod 
"91872da6-36e7-41f3-946e-160f8a255e28" (UID: "91872da6-36e7-41f3-946e-160f8a255e28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.458027 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91872da6-36e7-41f3-946e-160f8a255e28-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "91872da6-36e7-41f3-946e-160f8a255e28" (UID: "91872da6-36e7-41f3-946e-160f8a255e28"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.460352 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91872da6-36e7-41f3-946e-160f8a255e28-kube-api-access-wp6nz" (OuterVolumeSpecName: "kube-api-access-wp6nz") pod "91872da6-36e7-41f3-946e-160f8a255e28" (UID: "91872da6-36e7-41f3-946e-160f8a255e28"). InnerVolumeSpecName "kube-api-access-wp6nz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.484546 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-config-data" (OuterVolumeSpecName: "config-data") pod "91872da6-36e7-41f3-946e-160f8a255e28" (UID: "91872da6-36e7-41f3-946e-160f8a255e28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.499785 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-scripts" (OuterVolumeSpecName: "scripts") pod "91872da6-36e7-41f3-946e-160f8a255e28" (UID: "91872da6-36e7-41f3-946e-160f8a255e28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.554578 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.554637 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wp6nz\" (UniqueName: \"kubernetes.io/projected/91872da6-36e7-41f3-946e-160f8a255e28-kube-api-access-wp6nz\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.554652 5200 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/91872da6-36e7-41f3-946e-160f8a255e28-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.554661 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91872da6-36e7-41f3-946e-160f8a255e28-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.554670 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91872da6-36e7-41f3-946e-160f8a255e28-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.583589 5200 scope.go:117] "RemoveContainer" containerID="81c362e05144c036d596398dcd8fb84c2037aba9ee33addd0ef4714b0cbcf99d" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.600208 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-547f477b6f-tzg2z" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.600553 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547f477b6f-tzg2z" event={"ID":"91872da6-36e7-41f3-946e-160f8a255e28","Type":"ContainerDied","Data":"bf5cdf9e6a99860503ac34ec0613fa8b76cc4f3a80a636a52162fedc80568f39"} Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.600608 5200 scope.go:117] "RemoveContainer" containerID="10f41548671855c11947a05f4a9a9bcae419325fd607745d8080478b44dd5c00" Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.664561 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547f477b6f-tzg2z"] Nov 26 15:11:29 crc kubenswrapper[5200]: I1126 15:11:29.675161 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-547f477b6f-tzg2z"] Nov 26 15:11:30 crc kubenswrapper[5200]: I1126 15:11:30.280550 5200 scope.go:117] "RemoveContainer" containerID="e43f3e0a42eb44cc325c4c9c25910365ef5b6b62cdb4b48779899fc76ad9aaa4" Nov 26 15:11:30 crc kubenswrapper[5200]: I1126 15:11:30.491561 5200 scope.go:117] "RemoveContainer" containerID="34abfa6d0570f774fe8e957708178d31f9a5b609d93d963d68bd86470dd6b05a" Nov 26 15:11:30 crc kubenswrapper[5200]: I1126 15:11:30.512394 5200 scope.go:117] "RemoveContainer" containerID="d6f4bd23875b61dc826f2579db44df7a38ac0145070b9a05418474889fe09146" Nov 26 15:11:31 crc kubenswrapper[5200]: I1126 15:11:31.000904 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91872da6-36e7-41f3-946e-160f8a255e28" path="/var/lib/kubelet/pods/91872da6-36e7-41f3-946e-160f8a255e28/volumes" Nov 26 15:11:32 crc kubenswrapper[5200]: I1126 15:11:32.142411 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:11:33 crc kubenswrapper[5200]: I1126 15:11:33.953088 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:11:34 crc kubenswrapper[5200]: I1126 15:11:34.022826 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86dccf9c9c-th5p7"] Nov 26 15:11:34 crc kubenswrapper[5200]: I1126 15:11:34.023073 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-86dccf9c9c-th5p7" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon-log" containerID="cri-o://4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e" gracePeriod=30 Nov 26 15:11:34 crc kubenswrapper[5200]: I1126 15:11:34.023487 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-86dccf9c9c-th5p7" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" containerID="cri-o://227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087" gracePeriod=30 Nov 26 15:11:36 crc kubenswrapper[5200]: I1126 15:11:36.068107 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n5j7m"] Nov 26 15:11:36 crc kubenswrapper[5200]: I1126 15:11:36.079229 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-1d2c-account-create-update-skhvb"] Nov 26 15:11:36 crc kubenswrapper[5200]: I1126 15:11:36.090171 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n5j7m"] Nov 26 15:11:36 crc kubenswrapper[5200]: I1126 15:11:36.109534 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1d2c-account-create-update-skhvb"] Nov 26 
15:11:36 crc kubenswrapper[5200]: I1126 15:11:36.980909 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94489885-78b5-4375-8728-0ea952a5ae0d" path="/var/lib/kubelet/pods/94489885-78b5-4375-8728-0ea952a5ae0d/volumes" Nov 26 15:11:36 crc kubenswrapper[5200]: I1126 15:11:36.981895 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed77c7b7-3c1a-494e-a213-d83443e9692c" path="/var/lib/kubelet/pods/ed77c7b7-3c1a-494e-a213-d83443e9692c/volumes" Nov 26 15:11:37 crc kubenswrapper[5200]: I1126 15:11:37.691114 5200 generic.go:358] "Generic (PLEG): container finished" podID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerID="227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087" exitCode=0 Nov 26 15:11:37 crc kubenswrapper[5200]: I1126 15:11:37.691194 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dccf9c9c-th5p7" event={"ID":"29864ea7-0c2b-46ae-aa60-93b6a8318380","Type":"ContainerDied","Data":"227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087"} Nov 26 15:11:38 crc kubenswrapper[5200]: I1126 15:11:38.970015 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:11:40 crc kubenswrapper[5200]: I1126 15:11:40.137611 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dccf9c9c-th5p7" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 26 15:11:40 crc kubenswrapper[5200]: I1126 15:11:40.729511 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"1682d578e628d48f9c00a7583ad02f38b321739db47ca670781a2baf1284d737"} Nov 26 15:11:43 crc kubenswrapper[5200]: E1126 15:11:43.112118 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2098180 actualBytes=10240 Nov 26 15:11:45 crc kubenswrapper[5200]: I1126 15:11:45.031540 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-z6gzq"] Nov 26 15:11:45 crc kubenswrapper[5200]: I1126 15:11:45.041520 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-z6gzq"] Nov 26 15:11:46 crc kubenswrapper[5200]: I1126 15:11:46.984536 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88" path="/var/lib/kubelet/pods/80a9d0ed-a11c-4e30-b3f0-52ebf9eb6e88/volumes" Nov 26 15:11:50 crc kubenswrapper[5200]: I1126 15:11:50.138566 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dccf9c9c-th5p7" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 26 15:12:00 crc kubenswrapper[5200]: I1126 15:12:00.138959 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dccf9c9c-th5p7" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 26 15:12:00 crc kubenswrapper[5200]: I1126 
15:12:00.140067 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.458162 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.534128 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-config-data\") pod \"29864ea7-0c2b-46ae-aa60-93b6a8318380\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.534594 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29864ea7-0c2b-46ae-aa60-93b6a8318380-logs\") pod \"29864ea7-0c2b-46ae-aa60-93b6a8318380\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.534845 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcgxd\" (UniqueName: \"kubernetes.io/projected/29864ea7-0c2b-46ae-aa60-93b6a8318380-kube-api-access-fcgxd\") pod \"29864ea7-0c2b-46ae-aa60-93b6a8318380\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.535042 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-scripts\") pod \"29864ea7-0c2b-46ae-aa60-93b6a8318380\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.535087 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29864ea7-0c2b-46ae-aa60-93b6a8318380-logs" (OuterVolumeSpecName: "logs") pod "29864ea7-0c2b-46ae-aa60-93b6a8318380" (UID: "29864ea7-0c2b-46ae-aa60-93b6a8318380"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.535328 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29864ea7-0c2b-46ae-aa60-93b6a8318380-horizon-secret-key\") pod \"29864ea7-0c2b-46ae-aa60-93b6a8318380\" (UID: \"29864ea7-0c2b-46ae-aa60-93b6a8318380\") " Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.535777 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29864ea7-0c2b-46ae-aa60-93b6a8318380-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.540233 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29864ea7-0c2b-46ae-aa60-93b6a8318380-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "29864ea7-0c2b-46ae-aa60-93b6a8318380" (UID: "29864ea7-0c2b-46ae-aa60-93b6a8318380"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.548527 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29864ea7-0c2b-46ae-aa60-93b6a8318380-kube-api-access-fcgxd" (OuterVolumeSpecName: "kube-api-access-fcgxd") pod "29864ea7-0c2b-46ae-aa60-93b6a8318380" (UID: "29864ea7-0c2b-46ae-aa60-93b6a8318380"). 
InnerVolumeSpecName "kube-api-access-fcgxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.560788 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-scripts" (OuterVolumeSpecName: "scripts") pod "29864ea7-0c2b-46ae-aa60-93b6a8318380" (UID: "29864ea7-0c2b-46ae-aa60-93b6a8318380"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.561298 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-config-data" (OuterVolumeSpecName: "config-data") pod "29864ea7-0c2b-46ae-aa60-93b6a8318380" (UID: "29864ea7-0c2b-46ae-aa60-93b6a8318380"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.637855 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcgxd\" (UniqueName: \"kubernetes.io/projected/29864ea7-0c2b-46ae-aa60-93b6a8318380-kube-api-access-fcgxd\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.638181 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.638289 5200 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29864ea7-0c2b-46ae-aa60-93b6a8318380-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:04 crc kubenswrapper[5200]: I1126 15:12:04.638385 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29864ea7-0c2b-46ae-aa60-93b6a8318380-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.003247 5200 generic.go:358] "Generic (PLEG): container finished" podID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerID="4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e" exitCode=137 Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.003308 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dccf9c9c-th5p7" event={"ID":"29864ea7-0c2b-46ae-aa60-93b6a8318380","Type":"ContainerDied","Data":"4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e"} Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.003385 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86dccf9c9c-th5p7" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.004153 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dccf9c9c-th5p7" event={"ID":"29864ea7-0c2b-46ae-aa60-93b6a8318380","Type":"ContainerDied","Data":"9df56eb9f0ae3afb1d007dbc3a3001ab88093aba19f8d3a3be8db4652eb60a19"} Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.004280 5200 scope.go:117] "RemoveContainer" containerID="227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.032407 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86dccf9c9c-th5p7"] Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.041084 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86dccf9c9c-th5p7"] Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.196217 5200 scope.go:117] "RemoveContainer" containerID="4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.228697 5200 scope.go:117] "RemoveContainer" containerID="227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087" Nov 26 15:12:05 crc kubenswrapper[5200]: E1126 15:12:05.231235 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087\": container with ID starting with 227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087 not found: ID does not exist" containerID="227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.231292 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087"} err="failed to get container status \"227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087\": rpc error: code = NotFound desc = could not find container \"227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087\": container with ID starting with 227d03e112a6410023fe8b264ee0883327ac37f1c6173104c1ff090d77406087 not found: ID does not exist" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.231319 5200 scope.go:117] "RemoveContainer" containerID="4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e" Nov 26 15:12:05 crc kubenswrapper[5200]: E1126 15:12:05.231904 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e\": container with ID starting with 4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e not found: ID does not exist" containerID="4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e" Nov 26 15:12:05 crc kubenswrapper[5200]: I1126 15:12:05.231969 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e"} err="failed to get container status \"4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e\": rpc error: code = NotFound desc = could not find container \"4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e\": container with ID starting with 4366a1b1a3e68d15ab6ec29da354daa75477b5e3a978755d500b934bb1cfd35e not found: ID does not exist" Nov 26 15:12:06 crc 
kubenswrapper[5200]: I1126 15:12:06.994243 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" path="/var/lib/kubelet/pods/29864ea7-0c2b-46ae-aa60-93b6a8318380/volumes" Nov 26 15:12:17 crc kubenswrapper[5200]: I1126 15:12:17.053091 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8c2rg"] Nov 26 15:12:17 crc kubenswrapper[5200]: I1126 15:12:17.068577 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-0fd7-account-create-update-skfbq"] Nov 26 15:12:17 crc kubenswrapper[5200]: I1126 15:12:17.076247 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8c2rg"] Nov 26 15:12:17 crc kubenswrapper[5200]: I1126 15:12:17.083009 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0fd7-account-create-update-skfbq"] Nov 26 15:12:18 crc kubenswrapper[5200]: I1126 15:12:18.988742 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f1d368-42ab-4fcf-9938-638ee6cc0734" path="/var/lib/kubelet/pods/19f1d368-42ab-4fcf-9938-638ee6cc0734/volumes" Nov 26 15:12:18 crc kubenswrapper[5200]: I1126 15:12:18.989528 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6b73eb-d75d-4660-b0bd-3095b7197ec7" path="/var/lib/kubelet/pods/7f6b73eb-d75d-4660-b0bd-3095b7197ec7/volumes" Nov 26 15:12:23 crc kubenswrapper[5200]: I1126 15:12:23.047045 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zs9hq"] Nov 26 15:12:23 crc kubenswrapper[5200]: I1126 15:12:23.057672 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zs9hq"] Nov 26 15:12:24 crc kubenswrapper[5200]: I1126 15:12:24.983632 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047cdc76-603e-4cd3-a38c-10d473ec14f3" path="/var/lib/kubelet/pods/047cdc76-603e-4cd3-a38c-10d473ec14f3/volumes" Nov 26 15:12:30 crc kubenswrapper[5200]: I1126 15:12:30.867905 5200 scope.go:117] "RemoveContainer" containerID="8c982c4bd2ad1f0fb0adb81722788c3ebbefc29eaa35db0d7852776203276a8d" Nov 26 15:12:30 crc kubenswrapper[5200]: I1126 15:12:30.933375 5200 scope.go:117] "RemoveContainer" containerID="1b1d687c2f10d9506213f440666689318dadba51c6186795df7628aec2c3bc42" Nov 26 15:12:30 crc kubenswrapper[5200]: I1126 15:12:30.972342 5200 scope.go:117] "RemoveContainer" containerID="2bcce92ad40ea4f23f9aa07247cf89ee67ea1900992828080e5efef46e2926f8" Nov 26 15:12:31 crc kubenswrapper[5200]: I1126 15:12:31.029371 5200 scope.go:117] "RemoveContainer" containerID="039d36707774de098551c0fac8fb6c7ebfc05f6cfa89e3498b9025cfb5e15c1a" Nov 26 15:12:31 crc kubenswrapper[5200]: I1126 15:12:31.114268 5200 scope.go:117] "RemoveContainer" containerID="ef195788cc4a2fe50a4020cb9e87cdcd0018720ba15f052926059a68ac16e368" Nov 26 15:12:31 crc kubenswrapper[5200]: I1126 15:12:31.145001 5200 scope.go:117] "RemoveContainer" containerID="9af90665d53563ad73d20ae88b657ea66b87675b9ec77088a8a9225b3ee37357" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.451150 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-78d4d4fccc-2h985"] Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.456818 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon-log" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459501 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon-log" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459601 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459610 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459644 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459650 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459793 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon-log" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.459800 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon-log" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.460396 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.460425 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.460437 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="91872da6-36e7-41f3-946e-160f8a255e28" containerName="horizon-log" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.460449 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="29864ea7-0c2b-46ae-aa60-93b6a8318380" containerName="horizon-log" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.469224 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.503458 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78d4d4fccc-2h985"] Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.569557 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562c0bf-dfab-46f9-848a-b6819d557283-logs\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.569719 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4562c0bf-dfab-46f9-848a-b6819d557283-horizon-secret-key\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.569762 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4562c0bf-dfab-46f9-848a-b6819d557283-config-data\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.569791 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4562c0bf-dfab-46f9-848a-b6819d557283-scripts\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.569840 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx9dk\" (UniqueName: \"kubernetes.io/projected/4562c0bf-dfab-46f9-848a-b6819d557283-kube-api-access-mx9dk\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.672010 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4562c0bf-dfab-46f9-848a-b6819d557283-horizon-secret-key\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.672070 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4562c0bf-dfab-46f9-848a-b6819d557283-config-data\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.672098 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4562c0bf-dfab-46f9-848a-b6819d557283-scripts\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.672147 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9dk\" (UniqueName: 
\"kubernetes.io/projected/4562c0bf-dfab-46f9-848a-b6819d557283-kube-api-access-mx9dk\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.672199 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562c0bf-dfab-46f9-848a-b6819d557283-logs\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.672608 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562c0bf-dfab-46f9-848a-b6819d557283-logs\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.673452 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4562c0bf-dfab-46f9-848a-b6819d557283-scripts\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.673836 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4562c0bf-dfab-46f9-848a-b6819d557283-config-data\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.683272 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4562c0bf-dfab-46f9-848a-b6819d557283-horizon-secret-key\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.693066 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9dk\" (UniqueName: \"kubernetes.io/projected/4562c0bf-dfab-46f9-848a-b6819d557283-kube-api-access-mx9dk\") pod \"horizon-78d4d4fccc-2h985\" (UID: \"4562c0bf-dfab-46f9-848a-b6819d557283\") " pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:40 crc kubenswrapper[5200]: I1126 15:12:40.813278 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.273780 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78d4d4fccc-2h985"] Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.407789 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78d4d4fccc-2h985" event={"ID":"4562c0bf-dfab-46f9-848a-b6819d557283","Type":"ContainerStarted","Data":"c99c2cb18f5f4c63087fef90b3b4b2f208783486381595e1784296aa8a39a6eb"} Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.601588 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-8cbf7"] Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.610444 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.627495 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8cbf7"] Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.701639 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79392be7-8cc7-4cda-90ff-0ef8d41597a0-operator-scripts\") pod \"heat-db-create-8cbf7\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.702041 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htg88\" (UniqueName: \"kubernetes.io/projected/79392be7-8cc7-4cda-90ff-0ef8d41597a0-kube-api-access-htg88\") pod \"heat-db-create-8cbf7\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.721803 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/heat-34b5-account-create-update-sjvgl"] Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.729508 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.734403 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-db-secret\"" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.746750 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-34b5-account-create-update-sjvgl"] Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.803591 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d82eee2-b417-48ba-ac6c-51129e481b5f-operator-scripts\") pod \"heat-34b5-account-create-update-sjvgl\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.804081 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qzjj\" (UniqueName: \"kubernetes.io/projected/8d82eee2-b417-48ba-ac6c-51129e481b5f-kube-api-access-5qzjj\") pod \"heat-34b5-account-create-update-sjvgl\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.804263 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79392be7-8cc7-4cda-90ff-0ef8d41597a0-operator-scripts\") pod \"heat-db-create-8cbf7\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.804358 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htg88\" (UniqueName: \"kubernetes.io/projected/79392be7-8cc7-4cda-90ff-0ef8d41597a0-kube-api-access-htg88\") pod \"heat-db-create-8cbf7\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.805232 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/79392be7-8cc7-4cda-90ff-0ef8d41597a0-operator-scripts\") pod \"heat-db-create-8cbf7\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.836991 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htg88\" (UniqueName: \"kubernetes.io/projected/79392be7-8cc7-4cda-90ff-0ef8d41597a0-kube-api-access-htg88\") pod \"heat-db-create-8cbf7\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.907500 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qzjj\" (UniqueName: \"kubernetes.io/projected/8d82eee2-b417-48ba-ac6c-51129e481b5f-kube-api-access-5qzjj\") pod \"heat-34b5-account-create-update-sjvgl\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.907734 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d82eee2-b417-48ba-ac6c-51129e481b5f-operator-scripts\") pod \"heat-34b5-account-create-update-sjvgl\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.908430 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d82eee2-b417-48ba-ac6c-51129e481b5f-operator-scripts\") pod \"heat-34b5-account-create-update-sjvgl\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.922855 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qzjj\" (UniqueName: \"kubernetes.io/projected/8d82eee2-b417-48ba-ac6c-51129e481b5f-kube-api-access-5qzjj\") pod \"heat-34b5-account-create-update-sjvgl\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:41 crc kubenswrapper[5200]: I1126 15:12:41.940556 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:42 crc kubenswrapper[5200]: I1126 15:12:42.057629 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:42 crc kubenswrapper[5200]: I1126 15:12:42.427820 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78d4d4fccc-2h985" event={"ID":"4562c0bf-dfab-46f9-848a-b6819d557283","Type":"ContainerStarted","Data":"1dade5a529a6bf2f839162e2f74747fa34d1eb093d75cec8b61af9e01511b48d"} Nov 26 15:12:42 crc kubenswrapper[5200]: I1126 15:12:42.428236 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78d4d4fccc-2h985" event={"ID":"4562c0bf-dfab-46f9-848a-b6819d557283","Type":"ContainerStarted","Data":"21b1732e46a893cbf3f45b35434d0e2dba48260a4a09cd9ec5f0a49614d17e60"} Nov 26 15:12:42 crc kubenswrapper[5200]: I1126 15:12:42.462241 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78d4d4fccc-2h985" podStartSLOduration=2.462221074 podStartE2EDuration="2.462221074s" podCreationTimestamp="2025-11-26 15:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:12:42.449951887 +0000 UTC m=+6131.129153715" watchObservedRunningTime="2025-11-26 15:12:42.462221074 +0000 UTC m=+6131.141422892" Nov 26 15:12:42 crc kubenswrapper[5200]: W1126 15:12:42.478010 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79392be7_8cc7_4cda_90ff_0ef8d41597a0.slice/crio-f25a8d5ee3e29110a5d7a676228fd3fc922738620c52ed2458d21aee3399a1cf WatchSource:0}: Error finding container f25a8d5ee3e29110a5d7a676228fd3fc922738620c52ed2458d21aee3399a1cf: Status 404 returned error can't find the container with id f25a8d5ee3e29110a5d7a676228fd3fc922738620c52ed2458d21aee3399a1cf Nov 26 15:12:42 crc kubenswrapper[5200]: I1126 15:12:42.479258 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8cbf7"] Nov 26 15:12:42 crc kubenswrapper[5200]: W1126 15:12:42.567298 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d82eee2_b417_48ba_ac6c_51129e481b5f.slice/crio-54d02e8c37082aa304c310693ddd7055d5708693f718055098ffd0fc9b1b3bfe WatchSource:0}: Error finding container 54d02e8c37082aa304c310693ddd7055d5708693f718055098ffd0fc9b1b3bfe: Status 404 returned error can't find the container with id 54d02e8c37082aa304c310693ddd7055d5708693f718055098ffd0fc9b1b3bfe Nov 26 15:12:42 crc kubenswrapper[5200]: I1126 15:12:42.571274 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-34b5-account-create-update-sjvgl"] Nov 26 15:12:43 crc kubenswrapper[5200]: E1126 15:12:43.064738 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2096207 actualBytes=10240 Nov 26 15:12:43 crc kubenswrapper[5200]: I1126 15:12:43.437235 5200 generic.go:358] "Generic (PLEG): container finished" podID="8d82eee2-b417-48ba-ac6c-51129e481b5f" containerID="c9b1e8020d2ad944ca4de4276a99497cb27a05da868b8098fafab7020172c776" exitCode=0 Nov 26 15:12:43 crc kubenswrapper[5200]: I1126 15:12:43.437306 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-34b5-account-create-update-sjvgl" event={"ID":"8d82eee2-b417-48ba-ac6c-51129e481b5f","Type":"ContainerDied","Data":"c9b1e8020d2ad944ca4de4276a99497cb27a05da868b8098fafab7020172c776"} Nov 26 15:12:43 crc kubenswrapper[5200]: I1126 15:12:43.437652 5200 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openstack/heat-34b5-account-create-update-sjvgl" event={"ID":"8d82eee2-b417-48ba-ac6c-51129e481b5f","Type":"ContainerStarted","Data":"54d02e8c37082aa304c310693ddd7055d5708693f718055098ffd0fc9b1b3bfe"} Nov 26 15:12:43 crc kubenswrapper[5200]: I1126 15:12:43.439138 5200 generic.go:358] "Generic (PLEG): container finished" podID="79392be7-8cc7-4cda-90ff-0ef8d41597a0" containerID="30d190782117914a809c0102acd0e7fdaf534f467bff912bc662541476bc4a22" exitCode=0 Nov 26 15:12:43 crc kubenswrapper[5200]: I1126 15:12:43.439178 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8cbf7" event={"ID":"79392be7-8cc7-4cda-90ff-0ef8d41597a0","Type":"ContainerDied","Data":"30d190782117914a809c0102acd0e7fdaf534f467bff912bc662541476bc4a22"} Nov 26 15:12:43 crc kubenswrapper[5200]: I1126 15:12:43.439236 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8cbf7" event={"ID":"79392be7-8cc7-4cda-90ff-0ef8d41597a0","Type":"ContainerStarted","Data":"f25a8d5ee3e29110a5d7a676228fd3fc922738620c52ed2458d21aee3399a1cf"} Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.885401 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.892205 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.980049 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qzjj\" (UniqueName: \"kubernetes.io/projected/8d82eee2-b417-48ba-ac6c-51129e481b5f-kube-api-access-5qzjj\") pod \"8d82eee2-b417-48ba-ac6c-51129e481b5f\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.980507 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d82eee2-b417-48ba-ac6c-51129e481b5f-operator-scripts\") pod \"8d82eee2-b417-48ba-ac6c-51129e481b5f\" (UID: \"8d82eee2-b417-48ba-ac6c-51129e481b5f\") " Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.980792 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79392be7-8cc7-4cda-90ff-0ef8d41597a0-operator-scripts\") pod \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.981031 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htg88\" (UniqueName: \"kubernetes.io/projected/79392be7-8cc7-4cda-90ff-0ef8d41597a0-kube-api-access-htg88\") pod \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\" (UID: \"79392be7-8cc7-4cda-90ff-0ef8d41597a0\") " Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.981435 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79392be7-8cc7-4cda-90ff-0ef8d41597a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79392be7-8cc7-4cda-90ff-0ef8d41597a0" (UID: "79392be7-8cc7-4cda-90ff-0ef8d41597a0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.981580 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d82eee2-b417-48ba-ac6c-51129e481b5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d82eee2-b417-48ba-ac6c-51129e481b5f" (UID: "8d82eee2-b417-48ba-ac6c-51129e481b5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.983154 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d82eee2-b417-48ba-ac6c-51129e481b5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.983177 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79392be7-8cc7-4cda-90ff-0ef8d41597a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.986515 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79392be7-8cc7-4cda-90ff-0ef8d41597a0-kube-api-access-htg88" (OuterVolumeSpecName: "kube-api-access-htg88") pod "79392be7-8cc7-4cda-90ff-0ef8d41597a0" (UID: "79392be7-8cc7-4cda-90ff-0ef8d41597a0"). InnerVolumeSpecName "kube-api-access-htg88". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:12:44 crc kubenswrapper[5200]: I1126 15:12:44.987440 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d82eee2-b417-48ba-ac6c-51129e481b5f-kube-api-access-5qzjj" (OuterVolumeSpecName: "kube-api-access-5qzjj") pod "8d82eee2-b417-48ba-ac6c-51129e481b5f" (UID: "8d82eee2-b417-48ba-ac6c-51129e481b5f"). InnerVolumeSpecName "kube-api-access-5qzjj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.085110 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-htg88\" (UniqueName: \"kubernetes.io/projected/79392be7-8cc7-4cda-90ff-0ef8d41597a0-kube-api-access-htg88\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.085147 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5qzjj\" (UniqueName: \"kubernetes.io/projected/8d82eee2-b417-48ba-ac6c-51129e481b5f-kube-api-access-5qzjj\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.466220 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8cbf7" event={"ID":"79392be7-8cc7-4cda-90ff-0ef8d41597a0","Type":"ContainerDied","Data":"f25a8d5ee3e29110a5d7a676228fd3fc922738620c52ed2458d21aee3399a1cf"} Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.466640 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f25a8d5ee3e29110a5d7a676228fd3fc922738620c52ed2458d21aee3399a1cf" Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.466342 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8cbf7" Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.470101 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-34b5-account-create-update-sjvgl" Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.470110 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-34b5-account-create-update-sjvgl" event={"ID":"8d82eee2-b417-48ba-ac6c-51129e481b5f","Type":"ContainerDied","Data":"54d02e8c37082aa304c310693ddd7055d5708693f718055098ffd0fc9b1b3bfe"} Nov 26 15:12:45 crc kubenswrapper[5200]: I1126 15:12:45.470231 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d02e8c37082aa304c310693ddd7055d5708693f718055098ffd0fc9b1b3bfe" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.925803 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-4vfd4"] Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.927272 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79392be7-8cc7-4cda-90ff-0ef8d41597a0" containerName="mariadb-database-create" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.927297 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="79392be7-8cc7-4cda-90ff-0ef8d41597a0" containerName="mariadb-database-create" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.927377 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d82eee2-b417-48ba-ac6c-51129e481b5f" containerName="mariadb-account-create-update" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.927384 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d82eee2-b417-48ba-ac6c-51129e481b5f" containerName="mariadb-account-create-update" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.927566 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="79392be7-8cc7-4cda-90ff-0ef8d41597a0" containerName="mariadb-database-create" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.927593 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d82eee2-b417-48ba-ac6c-51129e481b5f" containerName="mariadb-account-create-update" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.953363 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-4vfd4"] Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.953502 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.957118 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-heat-dockercfg-2pxd8\"" Nov 26 15:12:46 crc kubenswrapper[5200]: I1126 15:12:46.957428 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-config-data\"" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.025860 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-config-data\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.025984 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-combined-ca-bundle\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.026188 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kxd\" (UniqueName: \"kubernetes.io/projected/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-kube-api-access-g5kxd\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.128023 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-config-data\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.128080 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-combined-ca-bundle\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.128198 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kxd\" (UniqueName: \"kubernetes.io/projected/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-kube-api-access-g5kxd\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.145259 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-combined-ca-bundle\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.145449 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-config-data\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.156185 5200 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-g5kxd\" (UniqueName: \"kubernetes.io/projected/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-kube-api-access-g5kxd\") pod \"heat-db-sync-4vfd4\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.278847 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:47 crc kubenswrapper[5200]: I1126 15:12:47.738764 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-4vfd4"] Nov 26 15:12:48 crc kubenswrapper[5200]: I1126 15:12:48.499615 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4vfd4" event={"ID":"f52475b3-abc6-48c1-ac1c-d351f7abf5b1","Type":"ContainerStarted","Data":"2cd6d4dc602fdfda0a70fe174e5b682ab2b455cdb48d3cf880ae0d455db0238b"} Nov 26 15:12:50 crc kubenswrapper[5200]: I1126 15:12:50.813810 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:50 crc kubenswrapper[5200]: I1126 15:12:50.814323 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78d4d4fccc-2h985" Nov 26 15:12:54 crc kubenswrapper[5200]: I1126 15:12:54.564577 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4vfd4" event={"ID":"f52475b3-abc6-48c1-ac1c-d351f7abf5b1","Type":"ContainerStarted","Data":"c9b149d661751e2c373e21272555502885bb03d20e9bf2d86571ccced1d07788"} Nov 26 15:12:54 crc kubenswrapper[5200]: I1126 15:12:54.588448 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-4vfd4" podStartSLOduration=2.244254454 podStartE2EDuration="8.588430083s" podCreationTimestamp="2025-11-26 15:12:46 +0000 UTC" firstStartedPulling="2025-11-26 15:12:47.759585434 +0000 UTC m=+6136.438787252" lastFinishedPulling="2025-11-26 15:12:54.103761063 +0000 UTC m=+6142.782962881" observedRunningTime="2025-11-26 15:12:54.580353498 +0000 UTC m=+6143.259555316" watchObservedRunningTime="2025-11-26 15:12:54.588430083 +0000 UTC m=+6143.267631901" Nov 26 15:12:56 crc kubenswrapper[5200]: I1126 15:12:56.594650 5200 generic.go:358] "Generic (PLEG): container finished" podID="f52475b3-abc6-48c1-ac1c-d351f7abf5b1" containerID="c9b149d661751e2c373e21272555502885bb03d20e9bf2d86571ccced1d07788" exitCode=0 Nov 26 15:12:56 crc kubenswrapper[5200]: I1126 15:12:56.594756 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4vfd4" event={"ID":"f52475b3-abc6-48c1-ac1c-d351f7abf5b1","Type":"ContainerDied","Data":"c9b149d661751e2c373e21272555502885bb03d20e9bf2d86571ccced1d07788"} Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.018147 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.069831 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-config-data\") pod \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.069911 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-combined-ca-bundle\") pod \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.070257 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5kxd\" (UniqueName: \"kubernetes.io/projected/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-kube-api-access-g5kxd\") pod \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\" (UID: \"f52475b3-abc6-48c1-ac1c-d351f7abf5b1\") " Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.082813 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-kube-api-access-g5kxd" (OuterVolumeSpecName: "kube-api-access-g5kxd") pod "f52475b3-abc6-48c1-ac1c-d351f7abf5b1" (UID: "f52475b3-abc6-48c1-ac1c-d351f7abf5b1"). InnerVolumeSpecName "kube-api-access-g5kxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.100436 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f52475b3-abc6-48c1-ac1c-d351f7abf5b1" (UID: "f52475b3-abc6-48c1-ac1c-d351f7abf5b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.154233 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-config-data" (OuterVolumeSpecName: "config-data") pod "f52475b3-abc6-48c1-ac1c-d351f7abf5b1" (UID: "f52475b3-abc6-48c1-ac1c-d351f7abf5b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.173102 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5kxd\" (UniqueName: \"kubernetes.io/projected/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-kube-api-access-g5kxd\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.173139 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.173149 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52475b3-abc6-48c1-ac1c-d351f7abf5b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.615039 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4vfd4" Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.615056 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4vfd4" event={"ID":"f52475b3-abc6-48c1-ac1c-d351f7abf5b1","Type":"ContainerDied","Data":"2cd6d4dc602fdfda0a70fe174e5b682ab2b455cdb48d3cf880ae0d455db0238b"} Nov 26 15:12:58 crc kubenswrapper[5200]: I1126 15:12:58.615102 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd6d4dc602fdfda0a70fe174e5b682ab2b455cdb48d3cf880ae0d455db0238b" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.655563 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5896b68b5c-qkmxp"] Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.660054 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f52475b3-abc6-48c1-ac1c-d351f7abf5b1" containerName="heat-db-sync" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.660083 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52475b3-abc6-48c1-ac1c-d351f7abf5b1" containerName="heat-db-sync" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.660281 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f52475b3-abc6-48c1-ac1c-d351f7abf5b1" containerName="heat-db-sync" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.793550 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5896b68b5c-qkmxp"] Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.793597 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7cc8b7d87-td966"] Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.796267 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.799891 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-config-data\"" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.800159 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-heat-dockercfg-2pxd8\"" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.800290 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-engine-config-data\"" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.804854 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cc8b7d87-td966"] Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.804984 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.813086 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-api-config-data\"" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.823260 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-config-data-custom\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.823306 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxkd\" (UniqueName: \"kubernetes.io/projected/a7e04d58-74fa-4aae-97bc-5d139867905b-kube-api-access-fhxkd\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.823704 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-config-data\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.823798 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-combined-ca-bundle\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.860759 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5f6ff6448b-8b8fd"] Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.875632 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.894673 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5f6ff6448b-8b8fd"] Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.907185 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"heat-cfnapi-config-data\"" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.926736 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-config-data-custom\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.926829 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-config-data\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.926877 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-combined-ca-bundle\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.926915 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-config-data-custom\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.926954 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-config-data\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.926997 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pw8\" (UniqueName: \"kubernetes.io/projected/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-kube-api-access-52pw8\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.927014 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-combined-ca-bundle\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.927031 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-combined-ca-bundle\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: 
\"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.928030 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjx4d\" (UniqueName: \"kubernetes.io/projected/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-kube-api-access-qjx4d\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.928099 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-config-data-custom\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.928155 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxkd\" (UniqueName: \"kubernetes.io/projected/a7e04d58-74fa-4aae-97bc-5d139867905b-kube-api-access-fhxkd\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.928178 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-config-data\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.937933 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-config-data-custom\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.938133 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-combined-ca-bundle\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.947335 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxkd\" (UniqueName: \"kubernetes.io/projected/a7e04d58-74fa-4aae-97bc-5d139867905b-kube-api-access-fhxkd\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:12:59 crc kubenswrapper[5200]: I1126 15:12:59.971891 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e04d58-74fa-4aae-97bc-5d139867905b-config-data\") pod \"heat-engine-5896b68b5c-qkmxp\" (UID: \"a7e04d58-74fa-4aae-97bc-5d139867905b\") " pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.030925 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-combined-ca-bundle\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: 
\"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031004 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-config-data-custom\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031081 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52pw8\" (UniqueName: \"kubernetes.io/projected/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-kube-api-access-52pw8\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031107 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-combined-ca-bundle\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031216 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjx4d\" (UniqueName: \"kubernetes.io/projected/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-kube-api-access-qjx4d\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031244 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-config-data\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031313 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-config-data-custom\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.031426 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-config-data\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.038166 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-config-data\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.041618 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-combined-ca-bundle\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" 
Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.045235 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-config-data-custom\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.048589 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-combined-ca-bundle\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.056372 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-config-data\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.056433 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pw8\" (UniqueName: \"kubernetes.io/projected/90191aa6-cceb-48f3-9af3-ed83fcb00f1a-kube-api-access-52pw8\") pod \"heat-api-7cc8b7d87-td966\" (UID: \"90191aa6-cceb-48f3-9af3-ed83fcb00f1a\") " pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.059565 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjx4d\" (UniqueName: \"kubernetes.io/projected/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-kube-api-access-qjx4d\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.063530 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fd4cbed-3800-4e75-8b5d-8a49a33debe0-config-data-custom\") pod \"heat-cfnapi-5f6ff6448b-8b8fd\" (UID: \"8fd4cbed-3800-4e75-8b5d-8a49a33debe0\") " pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.129153 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5896b68b5c-qkmxp" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.200102 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7cc8b7d87-td966" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.215613 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.675941 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5896b68b5c-qkmxp"] Nov 26 15:13:00 crc kubenswrapper[5200]: W1126 15:13:00.678267 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7e04d58_74fa_4aae_97bc_5d139867905b.slice/crio-b0efb28be8ba41d3fb796c4913dcd0f5ac8ca2c1f92b65c9d8a712c934179150 WatchSource:0}: Error finding container b0efb28be8ba41d3fb796c4913dcd0f5ac8ca2c1f92b65c9d8a712c934179150: Status 404 returned error can't find the container with id b0efb28be8ba41d3fb796c4913dcd0f5ac8ca2c1f92b65c9d8a712c934179150 Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.780835 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5f6ff6448b-8b8fd"] Nov 26 15:13:00 crc kubenswrapper[5200]: W1126 15:13:00.791293 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd4cbed_3800_4e75_8b5d_8a49a33debe0.slice/crio-8bf26a9cf935d1a1c18c02b28d102d7698f406dca37a837d7603bc382cea0416 WatchSource:0}: Error finding container 8bf26a9cf935d1a1c18c02b28d102d7698f406dca37a837d7603bc382cea0416: Status 404 returned error can't find the container with id 8bf26a9cf935d1a1c18c02b28d102d7698f406dca37a837d7603bc382cea0416 Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.816136 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-78d4d4fccc-2h985" podUID="4562c0bf-dfab-46f9-848a-b6819d557283" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Nov 26 15:13:00 crc kubenswrapper[5200]: I1126 15:13:00.872121 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cc8b7d87-td966"] Nov 26 15:13:00 crc kubenswrapper[5200]: W1126 15:13:00.873530 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90191aa6_cceb_48f3_9af3_ed83fcb00f1a.slice/crio-0864922a7ea2dce31459f5681f605372ba38687c1138a8dce27ebe7100cfa638 WatchSource:0}: Error finding container 0864922a7ea2dce31459f5681f605372ba38687c1138a8dce27ebe7100cfa638: Status 404 returned error can't find the container with id 0864922a7ea2dce31459f5681f605372ba38687c1138a8dce27ebe7100cfa638 Nov 26 15:13:01 crc kubenswrapper[5200]: I1126 15:13:01.657798 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" event={"ID":"8fd4cbed-3800-4e75-8b5d-8a49a33debe0","Type":"ContainerStarted","Data":"8bf26a9cf935d1a1c18c02b28d102d7698f406dca37a837d7603bc382cea0416"} Nov 26 15:13:01 crc kubenswrapper[5200]: I1126 15:13:01.661701 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc8b7d87-td966" event={"ID":"90191aa6-cceb-48f3-9af3-ed83fcb00f1a","Type":"ContainerStarted","Data":"0864922a7ea2dce31459f5681f605372ba38687c1138a8dce27ebe7100cfa638"} Nov 26 15:13:01 crc kubenswrapper[5200]: I1126 15:13:01.664292 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5896b68b5c-qkmxp" event={"ID":"a7e04d58-74fa-4aae-97bc-5d139867905b","Type":"ContainerStarted","Data":"dde9dfb4def81a162ee71130c4f02c41f65adc3b189814cf666d5132ed4f6d77"} Nov 26 15:13:01 crc kubenswrapper[5200]: 
I1126 15:13:01.664328 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5896b68b5c-qkmxp" event={"ID":"a7e04d58-74fa-4aae-97bc-5d139867905b","Type":"ContainerStarted","Data":"b0efb28be8ba41d3fb796c4913dcd0f5ac8ca2c1f92b65c9d8a712c934179150"}
Nov 26 15:13:01 crc kubenswrapper[5200]: I1126 15:13:01.664540 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/heat-engine-5896b68b5c-qkmxp"
Nov 26 15:13:01 crc kubenswrapper[5200]: I1126 15:13:01.691989 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5896b68b5c-qkmxp" podStartSLOduration=2.691964452 podStartE2EDuration="2.691964452s" podCreationTimestamp="2025-11-26 15:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:13:01.683192828 +0000 UTC m=+6150.362394646" watchObservedRunningTime="2025-11-26 15:13:01.691964452 +0000 UTC m=+6150.371166280"
Nov 26 15:13:04 crc kubenswrapper[5200]: I1126 15:13:04.697577 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" event={"ID":"8fd4cbed-3800-4e75-8b5d-8a49a33debe0","Type":"ContainerStarted","Data":"42b0bc678f40c3433e474ba823e4691d829bb4c75673652048c41a6f9cd1d81e"}
Nov 26 15:13:04 crc kubenswrapper[5200]: I1126 15:13:04.698287 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd"
Nov 26 15:13:04 crc kubenswrapper[5200]: I1126 15:13:04.700439 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc8b7d87-td966" event={"ID":"90191aa6-cceb-48f3-9af3-ed83fcb00f1a","Type":"ContainerStarted","Data":"b823d9c82354832c0287802aa95e964335eb9f8cd432a2f0967f63a3497df28f"}
Nov 26 15:13:04 crc kubenswrapper[5200]: I1126 15:13:04.700507 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/heat-api-7cc8b7d87-td966"
Nov 26 15:13:04 crc kubenswrapper[5200]: I1126 15:13:04.724431 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd" podStartSLOduration=3.083210859 podStartE2EDuration="5.72440218s" podCreationTimestamp="2025-11-26 15:12:59 +0000 UTC" firstStartedPulling="2025-11-26 15:13:00.794444689 +0000 UTC m=+6149.473646507" lastFinishedPulling="2025-11-26 15:13:03.43563601 +0000 UTC m=+6152.114837828" observedRunningTime="2025-11-26 15:13:04.71123435 +0000 UTC m=+6153.390436178" watchObservedRunningTime="2025-11-26 15:13:04.72440218 +0000 UTC m=+6153.403604028"
Nov 26 15:13:04 crc kubenswrapper[5200]: I1126 15:13:04.744745 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7cc8b7d87-td966" podStartSLOduration=3.192268638 podStartE2EDuration="5.74472312s" podCreationTimestamp="2025-11-26 15:12:59 +0000 UTC" firstStartedPulling="2025-11-26 15:13:00.878196115 +0000 UTC m=+6149.557397933" lastFinishedPulling="2025-11-26 15:13:03.430650597 +0000 UTC m=+6152.109852415" observedRunningTime="2025-11-26 15:13:04.733409309 +0000 UTC m=+6153.412611137" watchObservedRunningTime="2025-11-26 15:13:04.74472312 +0000 UTC m=+6153.423924948"
Nov 26 15:13:12 crc kubenswrapper[5200]: I1126 15:13:12.063608 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5f6ff6448b-8b8fd"
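
The pod_startup_latency_tracker entries above record how long each heat pod took from creation to being observed running. The sketch below is a rough API-side approximation of that number, assuming a reachable kubeconfig for this cluster; it compares the pod's creationTimestamp with the time its Ready condition last became True, which is not exactly what the kubelet's internal tracker measures (the kubelet also folds in image-pull timestamps such as firstStartedPulling/lastFinishedPulling).

    from kubernetes import client, config

    # Approximation only: the kubelet's tracker uses image-pull and PLEG
    # timestamps that are not all exposed through the API.
    config.load_kube_config()  # assumes a kubeconfig pointing at this cluster
    v1 = client.CoreV1Api()

    pod = v1.read_namespaced_pod("heat-engine-5896b68b5c-qkmxp", "openstack")
    created = pod.metadata.creation_timestamp
    ready_at = next(
        (c.last_transition_time
         for c in (pod.status.conditions or [])
         if c.type == "Ready" and c.status == "True"),
        None,
    )
    if ready_at is None:
        print("pod has not reported Ready yet")
    else:
        print(f"approx. startup duration: {(ready_at - created).total_seconds():.3f}s")
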
Nov 26 15:13:12 crc kubenswrapper[5200]: I1126 15:13:12.306433 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7cc8b7d87-td966"
Nov 26 15:13:12 crc kubenswrapper[5200]: I1126 15:13:12.720020 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5896b68b5c-qkmxp"
Nov 26 15:13:12 crc kubenswrapper[5200]: I1126 15:13:12.789492 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-78d4d4fccc-2h985"
Nov 26 15:13:15 crc kubenswrapper[5200]: I1126 15:13:15.117344 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-78d4d4fccc-2h985"
Nov 26 15:13:15 crc kubenswrapper[5200]: I1126 15:13:15.208912 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fd9587865-d7gsp"]
Nov 26 15:13:15 crc kubenswrapper[5200]: I1126 15:13:15.209173 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon-log" containerID="cri-o://3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9" gracePeriod=30
Nov 26 15:13:15 crc kubenswrapper[5200]: I1126 15:13:15.209704 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" containerID="cri-o://16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89" gracePeriod=30
Nov 26 15:13:18 crc kubenswrapper[5200]: I1126 15:13:18.877624 5200 generic.go:358] "Generic (PLEG): container finished" podID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerID="16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89" exitCode=0
Nov 26 15:13:18 crc kubenswrapper[5200]: I1126 15:13:18.877757 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fd9587865-d7gsp" event={"ID":"c0a1542c-d2dd-4963-9394-58e52fe057ff","Type":"ContainerDied","Data":"16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89"}
Nov 26 15:13:22 crc kubenswrapper[5200]: I1126 15:13:22.143933 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused"
Nov 26 15:13:27 crc kubenswrapper[5200]: I1126 15:13:27.086894 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-28zsn"]
Nov 26 15:13:27 crc kubenswrapper[5200]: I1126 15:13:27.098143 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xcw24"]
Nov 26 15:13:27 crc kubenswrapper[5200]: I1126 15:13:27.109080 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-28zsn"]
Nov 26 15:13:27 crc kubenswrapper[5200]: I1126 15:13:27.117922 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xcw24"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.032733 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-d5lnx"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.053366 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-0e9e-account-create-update-plwlj"]
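
The SyncLoop DELETE and "Killing container with a grace period" entries above are the kubelet's side of an ordinary pod deletion: the API server marks the pod for deletion and the kubelet stops each container, waiting up to the grace period before forcing termination. A minimal sketch of the client call that triggers this sequence, reusing the pod name from the log and the 30-second grace period reported by the kubelet (kubeconfig handling is assumed):

    from kubernetes import client, config

    config.load_kube_config()  # assumed: credentials for the cluster above
    v1 = client.CoreV1Api()

    # Request deletion; the kubelet then sends SIGTERM to the "horizon" and
    # "horizon-log" containers and escalates to SIGKILL if they are still
    # running after grace_period_seconds.
    v1.delete_namespaced_pod(
        name="horizon-6fd9587865-d7gsp",
        namespace="openstack",
        grace_period_seconds=30,
    )

The exit code 137 recorded later for the horizon-log container is consistent with a container that did not stop on SIGTERM and was killed when the grace period ran out.
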
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.061832 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-69e3-account-create-update-f7fsr"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.069235 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-d5lnx"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.076748 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0192-account-create-update-5clsm"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.085126 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0192-account-create-update-5clsm"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.091852 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-0e9e-account-create-update-plwlj"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.099314 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-69e3-account-create-update-f7fsr"]
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.991046 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0595c8ee-0fcc-46d9-a97a-fff2b66a5284" path="/var/lib/kubelet/pods/0595c8ee-0fcc-46d9-a97a-fff2b66a5284/volumes"
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.991968 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3d2845-4af4-4f14-ade5-afc0530f2ed1" path="/var/lib/kubelet/pods/6c3d2845-4af4-4f14-ade5-afc0530f2ed1/volumes"
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.992552 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710a41bd-779a-4224-82d6-03ba64697e32" path="/var/lib/kubelet/pods/710a41bd-779a-4224-82d6-03ba64697e32/volumes"
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.993095 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94225dcd-0095-452e-a108-5e98828f20a4" path="/var/lib/kubelet/pods/94225dcd-0095-452e-a108-5e98828f20a4/volumes"
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.994048 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27d0974-edc9-4344-85a9-52cb419cafaa" path="/var/lib/kubelet/pods/e27d0974-edc9-4344-85a9-52cb419cafaa/volumes"
Nov 26 15:13:28 crc kubenswrapper[5200]: I1126 15:13:28.994583 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88c5fd6-6888-43d2-8772-2055f386b9e7" path="/var/lib/kubelet/pods/f88c5fd6-6888-43d2-8772-2055f386b9e7/volumes"
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.009210 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"]
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.027441 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"]
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.031938 5200 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.035085 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.102840 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvkn\" (UniqueName: \"kubernetes.io/projected/798950a8-4762-4d77-9392-44c8323dca2c-kube-api-access-tnvkn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.102925 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.102982 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.205049 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvkn\" (UniqueName: \"kubernetes.io/projected/798950a8-4762-4d77-9392-44c8323dca2c-kube-api-access-tnvkn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.205101 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.205126 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.205597 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.205631 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.226106 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvkn\" (UniqueName: \"kubernetes.io/projected/798950a8-4762-4d77-9392-44c8323dca2c-kube-api-access-tnvkn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.363583 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"
Nov 26 15:13:30 crc kubenswrapper[5200]: I1126 15:13:30.801314 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"]
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.020851 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" event={"ID":"798950a8-4762-4d77-9392-44c8323dca2c","Type":"ContainerStarted","Data":"0ab52b0e5387a83811faaca36be7ac60ef4337de27094edaeed36bf466a44e32"}
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.365820 5200 scope.go:117] "RemoveContainer" containerID="5ee778c3eff81bdd1d8c96e4bad8259b8949eeedc0b3b26b9148196c3340228a"
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.450166 5200 scope.go:117] "RemoveContainer" containerID="c976b0181a762b4bfac461ad1ebb6762b393ac061e8d00f04d9ae54d918d5d61"
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.478834 5200 scope.go:117] "RemoveContainer" containerID="74661dd015674ffae44b2fe418694189e4802d55a0ea3924630f6d531a27914c"
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.515023 5200 scope.go:117] "RemoveContainer" containerID="531d4cfc4a86580928b5de6f8775840ca6eeb81f9c7dfc187c6c5a2061edb467"
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.565116 5200 scope.go:117] "RemoveContainer" containerID="53e39ae1a985831ea97c52a05bf5ae272295110be45a57975961604f31fbfbdb"
Nov 26 15:13:31 crc kubenswrapper[5200]: I1126 15:13:31.607120 5200 scope.go:117] "RemoveContainer" containerID="81c0b0005c659bc955e129114ed7e262466dbd299273de69a8402c6c1df1204a"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.050116 5200 generic.go:358] "Generic (PLEG): container finished" podID="798950a8-4762-4d77-9392-44c8323dca2c" containerID="bd23d42857cf1a0ed1077b52401e5e4191d273b724fcb400d7e849024a33254f" exitCode=0
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.050220 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" event={"ID":"798950a8-4762-4d77-9392-44c8323dca2c","Type":"ContainerDied","Data":"bd23d42857cf1a0ed1077b52401e5e4191d273b724fcb400d7e849024a33254f"}
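
The MountVolume.SetUp entries above cover the volumes used by this bundle-extraction pod: two emptyDir volumes ("bundle" and "util") plus the projected service-account token volume ("kube-api-access-tnvkn"), which is normally injected automatically rather than declared by hand. The sketch below shows how the declared part would look through the Python Kubernetes client; the image and mount paths are placeholders, not values taken from this log.

    from kubernetes import client

    # Two emptyDir volumes, matching the volume names seen in the log.
    volumes = [
        client.V1Volume(name="bundle", empty_dir=client.V1EmptyDirVolumeSource()),
        client.V1Volume(name="util", empty_dir=client.V1EmptyDirVolumeSource()),
    ]
    container = client.V1Container(
        name="extract",                              # container name as it appears later in the log
        image="registry.example.com/bundle:latest",  # placeholder image
        volume_mounts=[
            client.V1VolumeMount(name="bundle", mount_path="/bundle"),  # illustrative paths
            client.V1VolumeMount(name="util", mount_path="/util"),
        ],
    )
    pod_spec = client.V1PodSpec(containers=[container], volumes=volumes)

Secret-backed volumes such as the heat pods' "config-data" are declared the same way, with client.V1SecretVolumeSource(secret_name=...) in place of empty_dir.
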
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.144189 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.359078 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tcstf"]
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.373919 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.382857 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tcstf"]
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.455923 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpdg\" (UniqueName: \"kubernetes.io/projected/d3b5b239-875c-4a28-ba71-1e4494a31d2d-kube-api-access-4cpdg\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.455999 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-catalog-content\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.456091 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-utilities\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.558436 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpdg\" (UniqueName: \"kubernetes.io/projected/d3b5b239-875c-4a28-ba71-1e4494a31d2d-kube-api-access-4cpdg\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.558492 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-catalog-content\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.559265 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-utilities\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
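
The recurring "Probe failed" entries for the old horizon pod show an HTTP readiness probe at work: the kubelet issues a GET against http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/ and keeps the container marked not ready while the connection is refused. The sketch below expresses such a probe with the Python Kubernetes client; the path and port come from the probe output above, while the timing fields and image are illustrative rather than the values configured for this pod.

    from kubernetes import client

    readiness_probe = client.V1Probe(
        http_get=client.V1HTTPGetAction(
            path="/dashboard/auth/login/?next=/dashboard/",
            port=8080,
        ),
        period_seconds=10,      # illustrative timings, not read from the log
        failure_threshold=3,
    )
    container = client.V1Container(
        name="horizon",
        image="registry.example.com/horizon:latest",  # placeholder image
        readiness_probe=readiness_probe,
    )

Startup probes, like the one that later times out against the redhat-operators registry-server on port 50051, are attached the same way via the container's startup_probe field.
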
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.560080 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-utilities\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.560070 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-catalog-content\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.584497 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cpdg\" (UniqueName: \"kubernetes.io/projected/d3b5b239-875c-4a28-ba71-1e4494a31d2d-kube-api-access-4cpdg\") pod \"redhat-operators-tcstf\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:32 crc kubenswrapper[5200]: I1126 15:13:32.743753 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcstf"
Nov 26 15:13:33 crc kubenswrapper[5200]: I1126 15:13:33.241013 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tcstf"]
Nov 26 15:13:33 crc kubenswrapper[5200]: W1126 15:13:33.246400 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3b5b239_875c_4a28_ba71_1e4494a31d2d.slice/crio-e28ebfe88a5ed4d6d368033856b57e8a7d590b38c53d695b959b543dddf7d8b5 WatchSource:0}: Error finding container e28ebfe88a5ed4d6d368033856b57e8a7d590b38c53d695b959b543dddf7d8b5: Status 404 returned error can't find the container with id e28ebfe88a5ed4d6d368033856b57e8a7d590b38c53d695b959b543dddf7d8b5
Nov 26 15:13:34 crc kubenswrapper[5200]: I1126 15:13:34.069301 5200 generic.go:358] "Generic (PLEG): container finished" podID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerID="3e2c6974fa8a5f6c97fe0ad3b061bb6c58f79f4186bc2e3f89cf2c7b33622be9" exitCode=0
Nov 26 15:13:34 crc kubenswrapper[5200]: I1126 15:13:34.069498 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerDied","Data":"3e2c6974fa8a5f6c97fe0ad3b061bb6c58f79f4186bc2e3f89cf2c7b33622be9"}
Nov 26 15:13:34 crc kubenswrapper[5200]: I1126 15:13:34.069846 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerStarted","Data":"e28ebfe88a5ed4d6d368033856b57e8a7d590b38c53d695b959b543dddf7d8b5"}
Nov 26 15:13:35 crc kubenswrapper[5200]: I1126 15:13:35.081588 5200 generic.go:358] "Generic (PLEG): container finished" podID="798950a8-4762-4d77-9392-44c8323dca2c" containerID="3f2c9bd003539f68495bffff0f8ef0ad133a4f236b4cb02b600d34c250f69ec2" exitCode=0
Nov 26 15:13:35 crc kubenswrapper[5200]: I1126 15:13:35.081709 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" event={"ID":"798950a8-4762-4d77-9392-44c8323dca2c","Type":"ContainerDied","Data":"3f2c9bd003539f68495bffff0f8ef0ad133a4f236b4cb02b600d34c250f69ec2"}
Nov 26 15:13:36 crc kubenswrapper[5200]: I1126 15:13:36.094889 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2"
event={"ID":"798950a8-4762-4d77-9392-44c8323dca2c","Type":"ContainerStarted","Data":"580ac14b3f21f964617581ad74164bce504c982f5cfb81b1c0d48df2db48cda7"} Nov 26 15:13:36 crc kubenswrapper[5200]: I1126 15:13:36.097487 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerStarted","Data":"3bed4b1d83bad7eacbb288c65180be71229d1f71b50b97c71f9c043a4fe46bc4"} Nov 26 15:13:36 crc kubenswrapper[5200]: I1126 15:13:36.116144 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" podStartSLOduration=4.754332112 podStartE2EDuration="7.116118068s" podCreationTimestamp="2025-11-26 15:13:29 +0000 UTC" firstStartedPulling="2025-11-26 15:13:32.052828874 +0000 UTC m=+6180.732030692" lastFinishedPulling="2025-11-26 15:13:34.41461483 +0000 UTC m=+6183.093816648" observedRunningTime="2025-11-26 15:13:36.111034243 +0000 UTC m=+6184.790236091" watchObservedRunningTime="2025-11-26 15:13:36.116118068 +0000 UTC m=+6184.795319896" Nov 26 15:13:37 crc kubenswrapper[5200]: I1126 15:13:37.113252 5200 generic.go:358] "Generic (PLEG): container finished" podID="798950a8-4762-4d77-9392-44c8323dca2c" containerID="580ac14b3f21f964617581ad74164bce504c982f5cfb81b1c0d48df2db48cda7" exitCode=0 Nov 26 15:13:37 crc kubenswrapper[5200]: I1126 15:13:37.113364 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" event={"ID":"798950a8-4762-4d77-9392-44c8323dca2c","Type":"ContainerDied","Data":"580ac14b3f21f964617581ad74164bce504c982f5cfb81b1c0d48df2db48cda7"} Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.478771 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.517351 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-bundle\") pod \"798950a8-4762-4d77-9392-44c8323dca2c\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.517482 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-util\") pod \"798950a8-4762-4d77-9392-44c8323dca2c\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.517528 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvkn\" (UniqueName: \"kubernetes.io/projected/798950a8-4762-4d77-9392-44c8323dca2c-kube-api-access-tnvkn\") pod \"798950a8-4762-4d77-9392-44c8323dca2c\" (UID: \"798950a8-4762-4d77-9392-44c8323dca2c\") " Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.522370 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-bundle" (OuterVolumeSpecName: "bundle") pod "798950a8-4762-4d77-9392-44c8323dca2c" (UID: "798950a8-4762-4d77-9392-44c8323dca2c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.526236 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798950a8-4762-4d77-9392-44c8323dca2c-kube-api-access-tnvkn" (OuterVolumeSpecName: "kube-api-access-tnvkn") pod "798950a8-4762-4d77-9392-44c8323dca2c" (UID: "798950a8-4762-4d77-9392-44c8323dca2c"). InnerVolumeSpecName "kube-api-access-tnvkn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.531111 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-util" (OuterVolumeSpecName: "util") pod "798950a8-4762-4d77-9392-44c8323dca2c" (UID: "798950a8-4762-4d77-9392-44c8323dca2c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.621128 5200 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.621170 5200 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/798950a8-4762-4d77-9392-44c8323dca2c-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:38 crc kubenswrapper[5200]: I1126 15:13:38.621181 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnvkn\" (UniqueName: \"kubernetes.io/projected/798950a8-4762-4d77-9392-44c8323dca2c-kube-api-access-tnvkn\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:39 crc kubenswrapper[5200]: I1126 15:13:39.137957 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" Nov 26 15:13:39 crc kubenswrapper[5200]: I1126 15:13:39.137978 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2" event={"ID":"798950a8-4762-4d77-9392-44c8323dca2c","Type":"ContainerDied","Data":"0ab52b0e5387a83811faaca36be7ac60ef4337de27094edaeed36bf466a44e32"} Nov 26 15:13:39 crc kubenswrapper[5200]: I1126 15:13:39.138020 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab52b0e5387a83811faaca36be7ac60ef4337de27094edaeed36bf466a44e32" Nov 26 15:13:39 crc kubenswrapper[5200]: I1126 15:13:39.140931 5200 generic.go:358] "Generic (PLEG): container finished" podID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerID="3bed4b1d83bad7eacbb288c65180be71229d1f71b50b97c71f9c043a4fe46bc4" exitCode=0 Nov 26 15:13:39 crc kubenswrapper[5200]: I1126 15:13:39.141246 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerDied","Data":"3bed4b1d83bad7eacbb288c65180be71229d1f71b50b97c71f9c043a4fe46bc4"} Nov 26 15:13:40 crc kubenswrapper[5200]: I1126 15:13:40.152948 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerStarted","Data":"f0e3bec0f1630ef51a91e699ff620eee48c868341901491a0517f3727ce9349e"} Nov 26 15:13:40 crc kubenswrapper[5200]: I1126 15:13:40.178805 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tcstf" podStartSLOduration=7.228324076 podStartE2EDuration="8.178787375s" podCreationTimestamp="2025-11-26 15:13:32 +0000 UTC" firstStartedPulling="2025-11-26 15:13:34.070562087 +0000 UTC m=+6182.749763905" lastFinishedPulling="2025-11-26 15:13:35.021025386 +0000 UTC m=+6183.700227204" observedRunningTime="2025-11-26 15:13:40.176017232 +0000 UTC m=+6188.855219070" watchObservedRunningTime="2025-11-26 15:13:40.178787375 +0000 UTC m=+6188.857989193" Nov 26 15:13:42 crc kubenswrapper[5200]: I1126 15:13:42.144539 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/horizon-6fd9587865-d7gsp" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 26 15:13:42 crc kubenswrapper[5200]: I1126 15:13:42.144961 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:13:42 crc kubenswrapper[5200]: I1126 15:13:42.743922 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tcstf" Nov 26 15:13:42 crc kubenswrapper[5200]: I1126 15:13:42.744303 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-tcstf" Nov 26 15:13:43 crc kubenswrapper[5200]: I1126 15:13:43.059848 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bhmn"] Nov 26 15:13:43 crc kubenswrapper[5200]: I1126 15:13:43.072741 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bhmn"] Nov 26 15:13:43 crc 
kubenswrapper[5200]: E1126 15:13:43.681357 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2131282 actualBytes=10240 Nov 26 15:13:43 crc kubenswrapper[5200]: I1126 15:13:43.883361 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcstf" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" probeResult="failure" output=< Nov 26 15:13:43 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:13:43 crc kubenswrapper[5200]: > Nov 26 15:13:44 crc kubenswrapper[5200]: I1126 15:13:44.987835 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd766e85-0f40-4106-ba90-a82abaf7e00b" path="/var/lib/kubelet/pods/bd766e85-0f40-4106-ba90-a82abaf7e00b/volumes" Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.899677 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.974137 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-config-data\") pod \"c0a1542c-d2dd-4963-9394-58e52fe057ff\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.974425 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-scripts\") pod \"c0a1542c-d2dd-4963-9394-58e52fe057ff\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.974638 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28vvd\" (UniqueName: \"kubernetes.io/projected/c0a1542c-d2dd-4963-9394-58e52fe057ff-kube-api-access-28vvd\") pod \"c0a1542c-d2dd-4963-9394-58e52fe057ff\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.974937 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0a1542c-d2dd-4963-9394-58e52fe057ff-horizon-secret-key\") pod \"c0a1542c-d2dd-4963-9394-58e52fe057ff\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.975075 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a1542c-d2dd-4963-9394-58e52fe057ff-logs\") pod \"c0a1542c-d2dd-4963-9394-58e52fe057ff\" (UID: \"c0a1542c-d2dd-4963-9394-58e52fe057ff\") " Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.976329 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a1542c-d2dd-4963-9394-58e52fe057ff-logs" (OuterVolumeSpecName: "logs") pod "c0a1542c-d2dd-4963-9394-58e52fe057ff" (UID: "c0a1542c-d2dd-4963-9394-58e52fe057ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.990639 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a1542c-d2dd-4963-9394-58e52fe057ff-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c0a1542c-d2dd-4963-9394-58e52fe057ff" (UID: "c0a1542c-d2dd-4963-9394-58e52fe057ff"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:13:45 crc kubenswrapper[5200]: I1126 15:13:45.990810 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a1542c-d2dd-4963-9394-58e52fe057ff-kube-api-access-28vvd" (OuterVolumeSpecName: "kube-api-access-28vvd") pod "c0a1542c-d2dd-4963-9394-58e52fe057ff" (UID: "c0a1542c-d2dd-4963-9394-58e52fe057ff"). InnerVolumeSpecName "kube-api-access-28vvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.013702 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-scripts" (OuterVolumeSpecName: "scripts") pod "c0a1542c-d2dd-4963-9394-58e52fe057ff" (UID: "c0a1542c-d2dd-4963-9394-58e52fe057ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.030078 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-config-data" (OuterVolumeSpecName: "config-data") pod "c0a1542c-d2dd-4963-9394-58e52fe057ff" (UID: "c0a1542c-d2dd-4963-9394-58e52fe057ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.079248 5200 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0a1542c-d2dd-4963-9394-58e52fe057ff-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.079534 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a1542c-d2dd-4963-9394-58e52fe057ff-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.079625 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.079706 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a1542c-d2dd-4963-9394-58e52fe057ff-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.079781 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28vvd\" (UniqueName: \"kubernetes.io/projected/c0a1542c-d2dd-4963-9394-58e52fe057ff-kube-api-access-28vvd\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.212966 5200 generic.go:358] "Generic (PLEG): container finished" podID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerID="3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9" exitCode=137 Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.213105 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fd9587865-d7gsp" event={"ID":"c0a1542c-d2dd-4963-9394-58e52fe057ff","Type":"ContainerDied","Data":"3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9"} Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.213111 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fd9587865-d7gsp" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.213140 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fd9587865-d7gsp" event={"ID":"c0a1542c-d2dd-4963-9394-58e52fe057ff","Type":"ContainerDied","Data":"526159dbe93e58255bb1951d1631c9c732cf66b763db0f3b0e6bc439ef62161d"} Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.213157 5200 scope.go:117] "RemoveContainer" containerID="16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.269478 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fd9587865-d7gsp"] Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.325230 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fd9587865-d7gsp"] Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.389316 5200 scope.go:117] "RemoveContainer" containerID="3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.411618 5200 scope.go:117] "RemoveContainer" containerID="16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89" Nov 26 15:13:46 crc kubenswrapper[5200]: E1126 15:13:46.413178 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89\": container with ID starting with 16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89 not found: ID does not exist" containerID="16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.413213 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89"} err="failed to get container status \"16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89\": rpc error: code = NotFound desc = could not find container \"16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89\": container with ID starting with 16e92ec96133feba34fe79bed6d426af034f12b681287e974e027fbe5d07ae89 not found: ID does not exist" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.413239 5200 scope.go:117] "RemoveContainer" containerID="3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9" Nov 26 15:13:46 crc kubenswrapper[5200]: E1126 15:13:46.413562 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9\": container with ID starting with 3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9 not found: ID does not exist" containerID="3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9" Nov 26 15:13:46 crc kubenswrapper[5200]: I1126 15:13:46.413582 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9"} err="failed to get container status \"3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9\": rpc error: code = NotFound desc = could not find container \"3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9\": container with ID starting with 3d607ef31a38c41510812ce4e7f10b610e2c5a6beb7437b793070bd826b838f9 not found: ID does not exist" Nov 26 15:13:46 crc 
kubenswrapper[5200]: I1126 15:13:46.979613 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" path="/var/lib/kubelet/pods/c0a1542c-d2dd-4963-9394-58e52fe057ff/volumes" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.637084 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-86648f486b-29tzs"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638239 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon-log" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638257 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon-log" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638295 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638302 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638315 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="extract" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638321 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="extract" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638365 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="pull" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638372 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="pull" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638396 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="util" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638401 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="util" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638587 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon-log" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638605 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="798950a8-4762-4d77-9392-44c8323dca2c" containerName="extract" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.638622 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0a1542c-d2dd-4963-9394-58e52fe057ff" containerName="horizon" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.647101 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.648814 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-dockercfg-ls2xw\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.655492 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.656015 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.656319 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.662077 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.667057 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-dockercfg-vhdnx\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.667379 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-86648f486b-29tzs"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.669618 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-service-cert\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.678263 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.686561 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.719153 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7\" (UID: \"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.720988 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea8c42ca-7b97-4758-834c-aecd4ea85609-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl\" (UID: \"ea8c42ca-7b97-4758-834c-aecd4ea85609\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.721035 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7\" (UID: \"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.721087 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea8c42ca-7b97-4758-834c-aecd4ea85609-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl\" (UID: \"ea8c42ca-7b97-4758-834c-aecd4ea85609\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.721150 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pqw\" (UniqueName: \"kubernetes.io/projected/a65bbbf4-55fe-482b-acb0-fab23e8c3b8e-kube-api-access-52pqw\") pod \"obo-prometheus-operator-86648f486b-29tzs\" (UID: \"a65bbbf4-55fe-482b-acb0-fab23e8c3b8e\") " pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.729726 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.748601 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.825308 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7\" (UID: \"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.825428 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/ea8c42ca-7b97-4758-834c-aecd4ea85609-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl\" (UID: \"ea8c42ca-7b97-4758-834c-aecd4ea85609\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.825489 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7\" (UID: \"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.825533 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea8c42ca-7b97-4758-834c-aecd4ea85609-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl\" (UID: \"ea8c42ca-7b97-4758-834c-aecd4ea85609\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.825590 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52pqw\" (UniqueName: \"kubernetes.io/projected/a65bbbf4-55fe-482b-acb0-fab23e8c3b8e-kube-api-access-52pqw\") pod \"obo-prometheus-operator-86648f486b-29tzs\" (UID: \"a65bbbf4-55fe-482b-acb0-fab23e8c3b8e\") " pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.834570 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea8c42ca-7b97-4758-834c-aecd4ea85609-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl\" (UID: \"ea8c42ca-7b97-4758-834c-aecd4ea85609\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.834581 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea8c42ca-7b97-4758-834c-aecd4ea85609-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl\" (UID: \"ea8c42ca-7b97-4758-834c-aecd4ea85609\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.837957 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7\" (UID: \"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.843222 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7\" (UID: \"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.852697 5200 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-operators/observability-operator-78c97476f4-dlrhp"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.860092 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pqw\" (UniqueName: \"kubernetes.io/projected/a65bbbf4-55fe-482b-acb0-fab23e8c3b8e-kube-api-access-52pqw\") pod \"obo-prometheus-operator-86648f486b-29tzs\" (UID: \"a65bbbf4-55fe-482b-acb0-fab23e8c3b8e\") " pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.861768 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.873913 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"observability-operator-tls\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.874380 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"observability-operator-sa-dockercfg-5twhh\"" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.878606 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-78c97476f4-dlrhp"] Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.927925 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg28s\" (UniqueName: \"kubernetes.io/projected/a00bd761-94c1-4961-ae82-4e1c2d7b8dae-kube-api-access-jg28s\") pod \"observability-operator-78c97476f4-dlrhp\" (UID: \"a00bd761-94c1-4961-ae82-4e1c2d7b8dae\") " pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.928001 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a00bd761-94c1-4961-ae82-4e1c2d7b8dae-observability-operator-tls\") pod \"observability-operator-78c97476f4-dlrhp\" (UID: \"a00bd761-94c1-4961-ae82-4e1c2d7b8dae\") " pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.970653 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" Nov 26 15:13:47 crc kubenswrapper[5200]: I1126 15:13:47.989810 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.018082 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.030623 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg28s\" (UniqueName: \"kubernetes.io/projected/a00bd761-94c1-4961-ae82-4e1c2d7b8dae-kube-api-access-jg28s\") pod \"observability-operator-78c97476f4-dlrhp\" (UID: \"a00bd761-94c1-4961-ae82-4e1c2d7b8dae\") " pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.030829 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a00bd761-94c1-4961-ae82-4e1c2d7b8dae-observability-operator-tls\") pod \"observability-operator-78c97476f4-dlrhp\" (UID: \"a00bd761-94c1-4961-ae82-4e1c2d7b8dae\") " pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.038642 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a00bd761-94c1-4961-ae82-4e1c2d7b8dae-observability-operator-tls\") pod \"observability-operator-78c97476f4-dlrhp\" (UID: \"a00bd761-94c1-4961-ae82-4e1c2d7b8dae\") " pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.083775 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg28s\" (UniqueName: \"kubernetes.io/projected/a00bd761-94c1-4961-ae82-4e1c2d7b8dae-kube-api-access-jg28s\") pod \"observability-operator-78c97476f4-dlrhp\" (UID: \"a00bd761-94c1-4961-ae82-4e1c2d7b8dae\") " pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.113819 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-68bdb49cbf-jxcq6"] Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.123919 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.128112 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-68bdb49cbf-jxcq6"] Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.141994 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"perses-operator-dockercfg-djvzs\"" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.241001 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qvsq\" (UniqueName: \"kubernetes.io/projected/3fc7ee89-6096-441f-b376-fa628c95f1b5-kube-api-access-9qvsq\") pod \"perses-operator-68bdb49cbf-jxcq6\" (UID: \"3fc7ee89-6096-441f-b376-fa628c95f1b5\") " pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.241132 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fc7ee89-6096-441f-b376-fa628c95f1b5-openshift-service-ca\") pod \"perses-operator-68bdb49cbf-jxcq6\" (UID: \"3fc7ee89-6096-441f-b376-fa628c95f1b5\") " pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.257205 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.347664 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qvsq\" (UniqueName: \"kubernetes.io/projected/3fc7ee89-6096-441f-b376-fa628c95f1b5-kube-api-access-9qvsq\") pod \"perses-operator-68bdb49cbf-jxcq6\" (UID: \"3fc7ee89-6096-441f-b376-fa628c95f1b5\") " pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.347863 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fc7ee89-6096-441f-b376-fa628c95f1b5-openshift-service-ca\") pod \"perses-operator-68bdb49cbf-jxcq6\" (UID: \"3fc7ee89-6096-441f-b376-fa628c95f1b5\") " pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.348855 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fc7ee89-6096-441f-b376-fa628c95f1b5-openshift-service-ca\") pod \"perses-operator-68bdb49cbf-jxcq6\" (UID: \"3fc7ee89-6096-441f-b376-fa628c95f1b5\") " pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.371891 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qvsq\" (UniqueName: \"kubernetes.io/projected/3fc7ee89-6096-441f-b376-fa628c95f1b5-kube-api-access-9qvsq\") pod \"perses-operator-68bdb49cbf-jxcq6\" (UID: \"3fc7ee89-6096-441f-b376-fa628c95f1b5\") " pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.519258 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.629658 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-86648f486b-29tzs"] Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.724271 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl"] Nov 26 15:13:48 crc kubenswrapper[5200]: I1126 15:13:48.816007 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7"] Nov 26 15:13:49 crc kubenswrapper[5200]: W1126 15:13:49.000852 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00bd761_94c1_4961_ae82_4e1c2d7b8dae.slice/crio-686a06880413b5fa9b2bb9b962d09f2d447f8ac503473a8f35fca6dea829c889 WatchSource:0}: Error finding container 686a06880413b5fa9b2bb9b962d09f2d447f8ac503473a8f35fca6dea829c889: Status 404 returned error can't find the container with id 686a06880413b5fa9b2bb9b962d09f2d447f8ac503473a8f35fca6dea829c889 Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.003808 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-78c97476f4-dlrhp"] Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.192663 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-68bdb49cbf-jxcq6"] Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.270973 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" event={"ID":"3fc7ee89-6096-441f-b376-fa628c95f1b5","Type":"ContainerStarted","Data":"3aadae3314896adf13ab2735da82ef1c3165621007f5b7dc465702bfcf26cb7f"} Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.276368 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" event={"ID":"a00bd761-94c1-4961-ae82-4e1c2d7b8dae","Type":"ContainerStarted","Data":"686a06880413b5fa9b2bb9b962d09f2d447f8ac503473a8f35fca6dea829c889"} Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.278617 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" event={"ID":"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b","Type":"ContainerStarted","Data":"5c69b08197aba3ab065a6b9582a0c1035c96b3ed882fd5ffee9f42ec4c1e8519"} Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.279935 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" event={"ID":"ea8c42ca-7b97-4758-834c-aecd4ea85609","Type":"ContainerStarted","Data":"92ce3a4dc5c17e06540eb658f016cf65c41c23a2527fd69847b669e679f51c4a"} Nov 26 15:13:49 crc kubenswrapper[5200]: I1126 15:13:49.281676 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" event={"ID":"a65bbbf4-55fe-482b-acb0-fab23e8c3b8e","Type":"ContainerStarted","Data":"a8a1d804fe3a929683c56aa69f8809f34bfdc28d7bdecd1882ce9414d5bc3078"} Nov 26 15:13:53 crc kubenswrapper[5200]: I1126 15:13:53.846147 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcstf" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" 
probeResult="failure" output=< Nov 26 15:13:53 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:13:53 crc kubenswrapper[5200]: > Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.154403 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2crgw"] Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.166411 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.175425 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crgw"] Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.331065 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb9z4\" (UniqueName: \"kubernetes.io/projected/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-kube-api-access-xb9z4\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.331122 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-catalog-content\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.331251 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-utilities\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.433745 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb9z4\" (UniqueName: \"kubernetes.io/projected/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-kube-api-access-xb9z4\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.433809 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-catalog-content\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.433903 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-utilities\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.434357 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-catalog-content\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc 
kubenswrapper[5200]: I1126 15:13:58.434385 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-utilities\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.458537 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb9z4\" (UniqueName: \"kubernetes.io/projected/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-kube-api-access-xb9z4\") pod \"redhat-marketplace-2crgw\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:58 crc kubenswrapper[5200]: I1126 15:13:58.527770 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:13:59 crc kubenswrapper[5200]: I1126 15:13:59.815851 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crgw"] Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.045333 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k55nm"] Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.057928 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k55nm"] Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.510310 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" event={"ID":"3fc7ee89-6096-441f-b376-fa628c95f1b5","Type":"ContainerStarted","Data":"4f2c40c5b75275ca03d3b385c2b56d94a72ed3dc7292abff364b59283ae23ddc"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.510722 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.513560 5200 generic.go:358] "Generic (PLEG): container finished" podID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerID="ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca" exitCode=0 Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.513758 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerDied","Data":"ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.513824 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerStarted","Data":"22e0a90f0563004a20d2408c6e91e1fe76a2b0dfadf98582118350dc28da4c21"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.517795 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" event={"ID":"a00bd761-94c1-4961-ae82-4e1c2d7b8dae","Type":"ContainerStarted","Data":"0c88dc0547e31f895e65bbd257391877e7d9ec204a4868dfa09777c919addf6f"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.518674 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.522433 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" event={"ID":"8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b","Type":"ContainerStarted","Data":"765d12d8664f829144820e83a0a7fe9446d4b6ad82d20f4c7c1518ec76d59cf5"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.525991 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.528571 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" podStartSLOduration=2.534002579 podStartE2EDuration="12.528548679s" podCreationTimestamp="2025-11-26 15:13:48 +0000 UTC" firstStartedPulling="2025-11-26 15:13:49.237106674 +0000 UTC m=+6197.916308492" lastFinishedPulling="2025-11-26 15:13:59.231652774 +0000 UTC m=+6207.910854592" observedRunningTime="2025-11-26 15:14:00.52518422 +0000 UTC m=+6209.204386038" watchObservedRunningTime="2025-11-26 15:14:00.528548679 +0000 UTC m=+6209.207750497" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.531236 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" event={"ID":"ea8c42ca-7b97-4758-834c-aecd4ea85609","Type":"ContainerStarted","Data":"6025b36049cf24a8c5b7714ec769627e54ad4bf437cc58a4bdb601752bdd28e9"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.533702 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" event={"ID":"a65bbbf4-55fe-482b-acb0-fab23e8c3b8e","Type":"ContainerStarted","Data":"11b7489c6aaa1d674a693044e780ea5b5ecc3be62f2a9340ef475294a044b58c"} Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.558377 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-78c97476f4-dlrhp" podStartSLOduration=3.287080078 podStartE2EDuration="13.558359672s" podCreationTimestamp="2025-11-26 15:13:47 +0000 UTC" firstStartedPulling="2025-11-26 15:13:49.027064872 +0000 UTC m=+6197.706266690" lastFinishedPulling="2025-11-26 15:13:59.298344476 +0000 UTC m=+6207.977546284" observedRunningTime="2025-11-26 15:14:00.546044004 +0000 UTC m=+6209.225245842" watchObservedRunningTime="2025-11-26 15:14:00.558359672 +0000 UTC m=+6209.237561480" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.585570 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7" podStartSLOduration=3.2985485629999998 podStartE2EDuration="13.585548544s" podCreationTimestamp="2025-11-26 15:13:47 +0000 UTC" firstStartedPulling="2025-11-26 15:13:48.841171672 +0000 UTC m=+6197.520373490" lastFinishedPulling="2025-11-26 15:13:59.128171653 +0000 UTC m=+6207.807373471" observedRunningTime="2025-11-26 15:14:00.578791725 +0000 UTC m=+6209.257993553" watchObservedRunningTime="2025-11-26 15:14:00.585548544 +0000 UTC m=+6209.264750372" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.651220 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl" podStartSLOduration=3.292914932 podStartE2EDuration="13.651202618s" podCreationTimestamp="2025-11-26 15:13:47 +0000 UTC" firstStartedPulling="2025-11-26 15:13:48.769512778 +0000 UTC m=+6197.448714596" lastFinishedPulling="2025-11-26 
15:13:59.127800464 +0000 UTC m=+6207.807002282" observedRunningTime="2025-11-26 15:14:00.647491389 +0000 UTC m=+6209.326693217" watchObservedRunningTime="2025-11-26 15:14:00.651202618 +0000 UTC m=+6209.330404436" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.692443 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-86648f486b-29tzs" podStartSLOduration=3.122877814 podStartE2EDuration="13.692412043s" podCreationTimestamp="2025-11-26 15:13:47 +0000 UTC" firstStartedPulling="2025-11-26 15:13:48.661735904 +0000 UTC m=+6197.340937722" lastFinishedPulling="2025-11-26 15:13:59.231270133 +0000 UTC m=+6207.910471951" observedRunningTime="2025-11-26 15:14:00.672902355 +0000 UTC m=+6209.352104203" watchObservedRunningTime="2025-11-26 15:14:00.692412043 +0000 UTC m=+6209.371613861" Nov 26 15:14:00 crc kubenswrapper[5200]: I1126 15:14:00.979493 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6498da-7f02-4d36-89d3-7ec2246a67aa" path="/var/lib/kubelet/pods/3b6498da-7f02-4d36-89d3-7ec2246a67aa/volumes" Nov 26 15:14:01 crc kubenswrapper[5200]: I1126 15:14:01.040201 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4p82p"] Nov 26 15:14:01 crc kubenswrapper[5200]: I1126 15:14:01.055345 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4p82p"] Nov 26 15:14:01 crc kubenswrapper[5200]: I1126 15:14:01.546243 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerStarted","Data":"853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53"} Nov 26 15:14:02 crc kubenswrapper[5200]: I1126 15:14:02.557955 5200 generic.go:358] "Generic (PLEG): container finished" podID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerID="853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53" exitCode=0 Nov 26 15:14:02 crc kubenswrapper[5200]: I1126 15:14:02.558002 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerDied","Data":"853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53"} Nov 26 15:14:02 crc kubenswrapper[5200]: I1126 15:14:02.980874 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36d893d-1655-412a-ae4a-9ed40c6f68b2" path="/var/lib/kubelet/pods/c36d893d-1655-412a-ae4a-9ed40c6f68b2/volumes" Nov 26 15:14:03 crc kubenswrapper[5200]: I1126 15:14:03.154263 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:14:03 crc kubenswrapper[5200]: I1126 15:14:03.154605 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:14:03 crc kubenswrapper[5200]: I1126 15:14:03.808786 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcstf" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" 
containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:03 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:03 crc kubenswrapper[5200]: > Nov 26 15:14:06 crc kubenswrapper[5200]: I1126 15:14:06.625945 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerStarted","Data":"8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f"} Nov 26 15:14:06 crc kubenswrapper[5200]: I1126 15:14:06.662664 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2crgw" podStartSLOduration=8.009612712 podStartE2EDuration="8.662642685s" podCreationTimestamp="2025-11-26 15:13:58 +0000 UTC" firstStartedPulling="2025-11-26 15:14:00.517191568 +0000 UTC m=+6209.196393386" lastFinishedPulling="2025-11-26 15:14:01.170221541 +0000 UTC m=+6209.849423359" observedRunningTime="2025-11-26 15:14:06.653739999 +0000 UTC m=+6215.332941837" watchObservedRunningTime="2025-11-26 15:14:06.662642685 +0000 UTC m=+6215.341844503" Nov 26 15:14:08 crc kubenswrapper[5200]: I1126 15:14:08.529011 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:14:08 crc kubenswrapper[5200]: I1126 15:14:08.529334 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:14:09 crc kubenswrapper[5200]: I1126 15:14:09.577738 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2crgw" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:09 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:09 crc kubenswrapper[5200]: > Nov 26 15:14:11 crc kubenswrapper[5200]: I1126 15:14:11.549386 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-68bdb49cbf-jxcq6" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.085752 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.088961 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcstf" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:14 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:14 crc kubenswrapper[5200]: > Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.093913 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.110953 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.127734 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"telemetry-ceilometer-dockercfg-w57qm\"" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.155218 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.155445 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" containerName="openstackclient" containerID="cri-o://fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a" gracePeriod=2 Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.165487 5200 status_manager.go:895] "Failed to get status for pod" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.185569 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.226547 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clg57\" (UniqueName: \"kubernetes.io/projected/8911506d-17bb-44ea-8510-c5d547f65a28-kube-api-access-clg57\") pod \"kube-state-metrics-0\" (UID: \"8911506d-17bb-44ea-8510-c5d547f65a28\") " pod="openstack/kube-state-metrics-0" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.253353 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.255056 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" containerName="openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.255081 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" containerName="openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.255349 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" containerName="openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.265300 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.269827 5200 status_manager.go:895] "Failed to get status for pod" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" pod="openstack/openstackclient" err="pods \"openstackclient\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.273755 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.287355 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b38a0219-53c3-481a-8751-1a1e36442872" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.301452 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: E1126 15:14:14.306284 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xhsrn openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-xhsrn openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="b38a0219-53c3-481a-8751-1a1e36442872" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.330320 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.330402 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clg57\" (UniqueName: \"kubernetes.io/projected/8911506d-17bb-44ea-8510-c5d547f65a28-kube-api-access-clg57\") pod \"kube-state-metrics-0\" (UID: \"8911506d-17bb-44ea-8510-c5d547f65a28\") " pod="openstack/kube-state-metrics-0" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.330427 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config-secret\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.330576 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhsrn\" (UniqueName: \"kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.339009 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.361946 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.378256 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.385744 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.393860 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.397942 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clg57\" (UniqueName: \"kubernetes.io/projected/8911506d-17bb-44ea-8510-c5d547f65a28-kube-api-access-clg57\") pod \"kube-state-metrics-0\" (UID: \"8911506d-17bb-44ea-8510-c5d547f65a28\") " pod="openstack/kube-state-metrics-0" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.431601 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eff47485-b114-4f46-a598-85fd322a1f63-openstack-config\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.432126 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.432195 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config-secret\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.432238 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9grl\" (UniqueName: \"kubernetes.io/projected/eff47485-b114-4f46-a598-85fd322a1f63-kube-api-access-c9grl\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.432315 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eff47485-b114-4f46-a598-85fd322a1f63-openstack-config-secret\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.432429 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhsrn\" (UniqueName: \"kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.433105 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc 
kubenswrapper[5200]: E1126 15:14:14.441625 5200 projected.go:194] Error preparing data for projected volume kube-api-access-xhsrn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b38a0219-53c3-481a-8751-1a1e36442872) does not match the UID in record. The object might have been deleted and then recreated Nov 26 15:14:14 crc kubenswrapper[5200]: E1126 15:14:14.441723 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn podName:b38a0219-53c3-481a-8751-1a1e36442872 nodeName:}" failed. No retries permitted until 2025-11-26 15:14:14.941701737 +0000 UTC m=+6223.620903555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xhsrn" (UniqueName: "kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn") pod "openstackclient" (UID: "b38a0219-53c3-481a-8751-1a1e36442872") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b38a0219-53c3-481a-8751-1a1e36442872) does not match the UID in record. The object might have been deleted and then recreated Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.442116 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config-secret\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.476438 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.534510 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9grl\" (UniqueName: \"kubernetes.io/projected/eff47485-b114-4f46-a598-85fd322a1f63-kube-api-access-c9grl\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.534578 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eff47485-b114-4f46-a598-85fd322a1f63-openstack-config-secret\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.534741 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eff47485-b114-4f46-a598-85fd322a1f63-openstack-config\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.535463 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eff47485-b114-4f46-a598-85fd322a1f63-openstack-config\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.547648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/eff47485-b114-4f46-a598-85fd322a1f63-openstack-config-secret\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.574346 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9grl\" (UniqueName: \"kubernetes.io/projected/eff47485-b114-4f46-a598-85fd322a1f63-kube-api-access-c9grl\") pod \"openstackclient\" (UID: \"eff47485-b114-4f46-a598-85fd322a1f63\") " pod="openstack/openstackclient" Nov 26 15:14:14 crc kubenswrapper[5200]: I1126 15:14:14.835026 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:14.988308 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhsrn\" (UniqueName: \"kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn\") pod \"openstackclient\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " pod="openstack/openstackclient" Nov 26 15:14:15 crc kubenswrapper[5200]: E1126 15:14:14.995046 5200 projected.go:194] Error preparing data for projected volume kube-api-access-xhsrn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b38a0219-53c3-481a-8751-1a1e36442872) does not match the UID in record. The object might have been deleted and then recreated Nov 26 15:14:15 crc kubenswrapper[5200]: E1126 15:14:14.995118 5200 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn podName:b38a0219-53c3-481a-8751-1a1e36442872 nodeName:}" failed. No retries permitted until 2025-11-26 15:14:15.995098083 +0000 UTC m=+6224.674299911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xhsrn" (UniqueName: "kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn") pod "openstackclient" (UID: "b38a0219-53c3-481a-8751-1a1e36442872") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b38a0219-53c3-481a-8751-1a1e36442872) does not match the UID in record. The object might have been deleted and then recreated Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.186031 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.192480 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b38a0219-53c3-481a-8751-1a1e36442872" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.219394 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.249812 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b38a0219-53c3-481a-8751-1a1e36442872" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:15 crc kubenswrapper[5200]: W1126 15:14:15.324168 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8911506d_17bb_44ea_8510_c5d547f65a28.slice/crio-5664ae676aa2a2c438e9b7a880d6bac90c6fd35c47a93a919aeb7f810fdd3aab WatchSource:0}: Error finding container 5664ae676aa2a2c438e9b7a880d6bac90c6fd35c47a93a919aeb7f810fdd3aab: Status 404 returned error can't find the container with id 5664ae676aa2a2c438e9b7a880d6bac90c6fd35c47a93a919aeb7f810fdd3aab Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.327161 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config\") pod \"b38a0219-53c3-481a-8751-1a1e36442872\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.327330 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config-secret\") pod \"b38a0219-53c3-481a-8751-1a1e36442872\" (UID: \"b38a0219-53c3-481a-8751-1a1e36442872\") " Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.327881 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xhsrn\" (UniqueName: \"kubernetes.io/projected/b38a0219-53c3-481a-8751-1a1e36442872-kube-api-access-xhsrn\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.335474 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.335493 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b38a0219-53c3-481a-8751-1a1e36442872" (UID: "b38a0219-53c3-481a-8751-1a1e36442872"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.339218 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b38a0219-53c3-481a-8751-1a1e36442872" (UID: "b38a0219-53c3-481a-8751-1a1e36442872"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.430893 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.431196 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b38a0219-53c3-481a-8751-1a1e36442872-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.519008 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.526624 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.529965 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"prometheus-metric-storage\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.534062 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"prometheus-metric-storage-web-config\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.534129 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"prometheus-metric-storage-thanos-prometheus-http-client-file\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.534171 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"prometheus-metric-storage-rulefiles-0\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.534182 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"metric-storage-prometheus-dockercfg-wmp9r\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.534223 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"prometheus-metric-storage-tls-assets-0\"" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.569210 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637080 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637136 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637159 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 
15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637261 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbjp9\" (UniqueName: \"kubernetes.io/projected/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-kube-api-access-jbjp9\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637284 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-55ce801b-fc47-48ef-947a-8667002e201a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55ce801b-fc47-48ef-947a-8667002e201a\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637310 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.637356 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.681257 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739549 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbjp9\" (UniqueName: \"kubernetes.io/projected/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-kube-api-access-jbjp9\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-55ce801b-fc47-48ef-947a-8667002e201a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55ce801b-fc47-48ef-947a-8667002e201a\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739624 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739641 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739672 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739838 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739878 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.739903 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.742372 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.750381 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.752601 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.753599 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.755085 5200 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.764483 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.786035 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbjp9\" (UniqueName: \"kubernetes.io/projected/e0fed987-4381-4dae-b6d3-c0a5f5a5a605-kube-api-access-jbjp9\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.794468 5200 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.794507 5200 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-55ce801b-fc47-48ef-947a-8667002e201a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55ce801b-fc47-48ef-947a-8667002e201a\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7cbd96789e66b0457eb3003bdbb24c55aa033c1664ab1b32a2025d663b998695/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.971018 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.989289 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:15 crc kubenswrapper[5200]: I1126 15:14:15.990184 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-55ce801b-fc47-48ef-947a-8667002e201a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55ce801b-fc47-48ef-947a-8667002e201a\") pod \"prometheus-metric-storage-0\" (UID: \"e0fed987-4381-4dae-b6d3-c0a5f5a5a605\") " pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.003309 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"alertmanager-metric-storage-tls-assets-0\"" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.003585 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"alertmanager-metric-storage-cluster-tls-config\"" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.003872 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"metric-storage-alertmanager-dockercfg-vzpld\"" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.004040 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"alertmanager-metric-storage-generated\"" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.004199 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"alertmanager-metric-storage-web-config\"" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.005765 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.057302 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.057424 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.057487 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.057523 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.071939 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-tls-assets\") 
pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.072152 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.072195 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svtx\" (UniqueName: \"kubernetes.io/projected/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-kube-api-access-9svtx\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.166005 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.176003 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.176114 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.176160 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.176183 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.176218 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.176291 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc 
kubenswrapper[5200]: I1126 15:14:16.176322 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9svtx\" (UniqueName: \"kubernetes.io/projected/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-kube-api-access-9svtx\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.188010 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.188321 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.225849 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.251149 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8911506d-17bb-44ea-8510-c5d547f65a28","Type":"ContainerStarted","Data":"5664ae676aa2a2c438e9b7a880d6bac90c6fd35c47a93a919aeb7f810fdd3aab"} Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.254338 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svtx\" (UniqueName: \"kubernetes.io/projected/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-kube-api-access-9svtx\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.257118 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.264136 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.272334 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.272548 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"eff47485-b114-4f46-a598-85fd322a1f63","Type":"ContainerStarted","Data":"45f3d3f9b283e70eed226fa707ece44461eb4658a11d0ed8472229fb67b2ed47"} Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.283862 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b38a0219-53c3-481a-8751-1a1e36442872" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.286317 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/12dbd5cb-56ae-4804-bfbf-e23b17e370dc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"12dbd5cb-56ae-4804-bfbf-e23b17e370dc\") " pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.296859 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b38a0219-53c3-481a-8751-1a1e36442872" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.340628 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:16 crc kubenswrapper[5200]: I1126 15:14:16.998780 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38a0219-53c3-481a-8751-1a1e36442872" path="/var/lib/kubelet/pods/b38a0219-53c3-481a-8751-1a1e36442872/volumes" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.046350 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.114695 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw4ft\" (UniqueName: \"kubernetes.io/projected/6f802e4d-ea2f-4754-ab33-a388f2b1efef-kube-api-access-hw4ft\") pod \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.114817 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config-secret\") pod \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.114896 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config\") pod \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\" (UID: \"6f802e4d-ea2f-4754-ab33-a388f2b1efef\") " Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.126659 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.141071 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f802e4d-ea2f-4754-ab33-a388f2b1efef-kube-api-access-hw4ft" (OuterVolumeSpecName: "kube-api-access-hw4ft") pod "6f802e4d-ea2f-4754-ab33-a388f2b1efef" (UID: "6f802e4d-ea2f-4754-ab33-a388f2b1efef"). 
InnerVolumeSpecName "kube-api-access-hw4ft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.146351 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6f802e4d-ea2f-4754-ab33-a388f2b1efef" (UID: "6f802e4d-ea2f-4754-ab33-a388f2b1efef"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.162878 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 26 15:14:17 crc kubenswrapper[5200]: W1126 15:14:17.183101 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fed987_4381_4dae_b6d3_c0a5f5a5a605.slice/crio-8a541395cad490e94cf276bb5098d595d05c08829196012908d753af7d4692b4 WatchSource:0}: Error finding container 8a541395cad490e94cf276bb5098d595d05c08829196012908d753af7d4692b4: Status 404 returned error can't find the container with id 8a541395cad490e94cf276bb5098d595d05c08829196012908d753af7d4692b4 Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.218013 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hw4ft\" (UniqueName: \"kubernetes.io/projected/6f802e4d-ea2f-4754-ab33-a388f2b1efef-kube-api-access-hw4ft\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.219093 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.230549 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6f802e4d-ea2f-4754-ab33-a388f2b1efef" (UID: "6f802e4d-ea2f-4754-ab33-a388f2b1efef"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.281506 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0fed987-4381-4dae-b6d3-c0a5f5a5a605","Type":"ContainerStarted","Data":"8a541395cad490e94cf276bb5098d595d05c08829196012908d753af7d4692b4"} Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.283270 5200 generic.go:358] "Generic (PLEG): container finished" podID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" containerID="fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a" exitCode=137 Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.283341 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.283361 5200 scope.go:117] "RemoveContainer" containerID="fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.287861 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"eff47485-b114-4f46-a598-85fd322a1f63","Type":"ContainerStarted","Data":"e154d3c3de26ee20dd7f0f51ad4019f2543b488573835d626336712d8a6d14b2"} Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.298745 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"12dbd5cb-56ae-4804-bfbf-e23b17e370dc","Type":"ContainerStarted","Data":"684772cc698d9596e97b1a553699e164ba3f883c540d3152e83ddef21d41f91a"} Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.314909 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8911506d-17bb-44ea-8510-c5d547f65a28","Type":"ContainerStarted","Data":"8b4a5be7f85befe377fce1531bcc781c36fb0aa763ace55789e01960c1548b6d"} Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.315708 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/kube-state-metrics-0" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.317141 5200 scope.go:117] "RemoveContainer" containerID="fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.320470 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f802e4d-ea2f-4754-ab33-a388f2b1efef-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.321063 5200 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" podUID="eff47485-b114-4f46-a598-85fd322a1f63" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.321995 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.321976852 podStartE2EDuration="3.321976852s" podCreationTimestamp="2025-11-26 15:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:14:17.313484376 +0000 UTC m=+6225.992686194" watchObservedRunningTime="2025-11-26 15:14:17.321976852 +0000 UTC m=+6226.001178670" Nov 26 15:14:17 crc kubenswrapper[5200]: E1126 15:14:17.319206 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a\": container with ID starting with fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a not found: ID does not exist" containerID="fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.323947 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a"} err="failed to get container status \"fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a\": rpc error: code = NotFound desc = could not find container 
\"fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a\": container with ID starting with fbc22e3971bef4ee74ad70fcc19e0dcb3a9cd9c98e959ac48391d7b6d83a7e0a not found: ID does not exist" Nov 26 15:14:17 crc kubenswrapper[5200]: I1126 15:14:17.352444 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.915333345 podStartE2EDuration="4.352427041s" podCreationTimestamp="2025-11-26 15:14:13 +0000 UTC" firstStartedPulling="2025-11-26 15:14:15.336397884 +0000 UTC m=+6224.015599692" lastFinishedPulling="2025-11-26 15:14:15.77349157 +0000 UTC m=+6224.452693388" observedRunningTime="2025-11-26 15:14:17.342294602 +0000 UTC m=+6226.021496440" watchObservedRunningTime="2025-11-26 15:14:17.352427041 +0000 UTC m=+6226.031628859" Nov 26 15:14:18 crc kubenswrapper[5200]: I1126 15:14:18.037223 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6h7j7"] Nov 26 15:14:18 crc kubenswrapper[5200]: I1126 15:14:18.047612 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6h7j7"] Nov 26 15:14:18 crc kubenswrapper[5200]: I1126 15:14:18.578336 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:14:18 crc kubenswrapper[5200]: I1126 15:14:18.630551 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:14:18 crc kubenswrapper[5200]: I1126 15:14:18.985913 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f802e4d-ea2f-4754-ab33-a388f2b1efef" path="/var/lib/kubelet/pods/6f802e4d-ea2f-4754-ab33-a388f2b1efef/volumes" Nov 26 15:14:18 crc kubenswrapper[5200]: I1126 15:14:18.987014 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afc7daf7-6707-493a-b1df-16a392e9b64c" path="/var/lib/kubelet/pods/afc7daf7-6707-493a-b1df-16a392e9b64c/volumes" Nov 26 15:14:19 crc kubenswrapper[5200]: I1126 15:14:19.190449 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crgw"] Nov 26 15:14:20 crc kubenswrapper[5200]: I1126 15:14:20.347197 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2crgw" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="registry-server" containerID="cri-o://8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f" gracePeriod=2 Nov 26 15:14:20 crc kubenswrapper[5200]: I1126 15:14:20.862528 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.008856 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-catalog-content\") pod \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.008971 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-utilities\") pod \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.009173 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb9z4\" (UniqueName: \"kubernetes.io/projected/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-kube-api-access-xb9z4\") pod \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\" (UID: \"c3afe83c-83ce-46ee-9b6a-4c1bee33d130\") " Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.009815 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-utilities" (OuterVolumeSpecName: "utilities") pod "c3afe83c-83ce-46ee-9b6a-4c1bee33d130" (UID: "c3afe83c-83ce-46ee-9b6a-4c1bee33d130"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.010628 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.017864 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3afe83c-83ce-46ee-9b6a-4c1bee33d130" (UID: "c3afe83c-83ce-46ee-9b6a-4c1bee33d130"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.022791 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-kube-api-access-xb9z4" (OuterVolumeSpecName: "kube-api-access-xb9z4") pod "c3afe83c-83ce-46ee-9b6a-4c1bee33d130" (UID: "c3afe83c-83ce-46ee-9b6a-4c1bee33d130"). InnerVolumeSpecName "kube-api-access-xb9z4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.112861 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.113145 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xb9z4\" (UniqueName: \"kubernetes.io/projected/c3afe83c-83ce-46ee-9b6a-4c1bee33d130-kube-api-access-xb9z4\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.361262 5200 generic.go:358] "Generic (PLEG): container finished" podID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerID="8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f" exitCode=0 Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.361342 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crgw" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.361337 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerDied","Data":"8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f"} Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.361759 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crgw" event={"ID":"c3afe83c-83ce-46ee-9b6a-4c1bee33d130","Type":"ContainerDied","Data":"22e0a90f0563004a20d2408c6e91e1fe76a2b0dfadf98582118350dc28da4c21"} Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.361782 5200 scope.go:117] "RemoveContainer" containerID="8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.393791 5200 scope.go:117] "RemoveContainer" containerID="853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.398168 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crgw"] Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.407968 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crgw"] Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.598949 5200 scope.go:117] "RemoveContainer" containerID="ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.639093 5200 scope.go:117] "RemoveContainer" containerID="8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f" Nov 26 15:14:21 crc kubenswrapper[5200]: E1126 15:14:21.639604 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f\": container with ID starting with 8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f not found: ID does not exist" containerID="8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.639644 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f"} err="failed to get container status 
\"8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f\": rpc error: code = NotFound desc = could not find container \"8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f\": container with ID starting with 8388827bf61f7b21de056ed71585a1b9962630c72f882491a3ef7143c598e72f not found: ID does not exist" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.639672 5200 scope.go:117] "RemoveContainer" containerID="853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53" Nov 26 15:14:21 crc kubenswrapper[5200]: E1126 15:14:21.640175 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53\": container with ID starting with 853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53 not found: ID does not exist" containerID="853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.640197 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53"} err="failed to get container status \"853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53\": rpc error: code = NotFound desc = could not find container \"853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53\": container with ID starting with 853c549e8a64d91f96bf13634eb0aa573ed9b14a146572ee68f12aeaad874f53 not found: ID does not exist" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.640327 5200 scope.go:117] "RemoveContainer" containerID="ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca" Nov 26 15:14:21 crc kubenswrapper[5200]: E1126 15:14:21.640735 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca\": container with ID starting with ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca not found: ID does not exist" containerID="ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca" Nov 26 15:14:21 crc kubenswrapper[5200]: I1126 15:14:21.640800 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca"} err="failed to get container status \"ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca\": rpc error: code = NotFound desc = could not find container \"ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca\": container with ID starting with ff4fddd3f253e05767438b2956ef0736b898ee0a6b8369945f3e6df645cee7ca not found: ID does not exist" Nov 26 15:14:22 crc kubenswrapper[5200]: I1126 15:14:22.987806 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" path="/var/lib/kubelet/pods/c3afe83c-83ce-46ee-9b6a-4c1bee33d130/volumes" Nov 26 15:14:23 crc kubenswrapper[5200]: I1126 15:14:23.389166 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0fed987-4381-4dae-b6d3-c0a5f5a5a605","Type":"ContainerStarted","Data":"362e7bdc05ea294bac8315a79ab6ac4d0b1d43412dbc24ac497a87a8a2686e83"} Nov 26 15:14:23 crc kubenswrapper[5200]: I1126 15:14:23.392735 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"12dbd5cb-56ae-4804-bfbf-e23b17e370dc","Type":"ContainerStarted","Data":"c855df0531c2a87f10c67f968f3d45932b103cfcdfb46e20132f95ca51d1e6bc"} Nov 26 15:14:23 crc kubenswrapper[5200]: I1126 15:14:23.794917 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcstf" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:23 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:23 crc kubenswrapper[5200]: > Nov 26 15:14:29 crc kubenswrapper[5200]: I1126 15:14:29.338535 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 26 15:14:29 crc kubenswrapper[5200]: I1126 15:14:29.460670 5200 generic.go:358] "Generic (PLEG): container finished" podID="e0fed987-4381-4dae-b6d3-c0a5f5a5a605" containerID="362e7bdc05ea294bac8315a79ab6ac4d0b1d43412dbc24ac497a87a8a2686e83" exitCode=0 Nov 26 15:14:29 crc kubenswrapper[5200]: I1126 15:14:29.460897 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0fed987-4381-4dae-b6d3-c0a5f5a5a605","Type":"ContainerDied","Data":"362e7bdc05ea294bac8315a79ab6ac4d0b1d43412dbc24ac497a87a8a2686e83"} Nov 26 15:14:30 crc kubenswrapper[5200]: I1126 15:14:30.474460 5200 generic.go:358] "Generic (PLEG): container finished" podID="12dbd5cb-56ae-4804-bfbf-e23b17e370dc" containerID="c855df0531c2a87f10c67f968f3d45932b103cfcdfb46e20132f95ca51d1e6bc" exitCode=0 Nov 26 15:14:30 crc kubenswrapper[5200]: I1126 15:14:30.474570 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"12dbd5cb-56ae-4804-bfbf-e23b17e370dc","Type":"ContainerDied","Data":"c855df0531c2a87f10c67f968f3d45932b103cfcdfb46e20132f95ca51d1e6bc"} Nov 26 15:14:31 crc kubenswrapper[5200]: I1126 15:14:31.794902 5200 scope.go:117] "RemoveContainer" containerID="529a846ffe4e6f6be4eb4bedce481a90e939c3666723c3cb5b69d125f1c56c15" Nov 26 15:14:31 crc kubenswrapper[5200]: I1126 15:14:31.905763 5200 scope.go:117] "RemoveContainer" containerID="ac542146776fe8648e3c5dd64dc05b58360d195975eeab1901b6239405f1e2ab" Nov 26 15:14:31 crc kubenswrapper[5200]: I1126 15:14:31.950934 5200 scope.go:117] "RemoveContainer" containerID="124598b1ae2b78bad1a659d4876a95df98be647c2c4eacb2583aecbe29e6dfe1" Nov 26 15:14:32 crc kubenswrapper[5200]: I1126 15:14:32.022672 5200 scope.go:117] "RemoveContainer" containerID="6042d147398272a3cf66bf47c8885391a45df54976aec23eeeb0c6df8ca23b84" Nov 26 15:14:32 crc kubenswrapper[5200]: I1126 15:14:32.083804 5200 scope.go:117] "RemoveContainer" containerID="e8422a1ad238a2d8574b310c987bc9355d98c86a1506f9ffc5a57f15ddd04d3a" Nov 26 15:14:32 crc kubenswrapper[5200]: I1126 15:14:32.801419 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tcstf" Nov 26 15:14:32 crc kubenswrapper[5200]: I1126 15:14:32.868962 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tcstf" Nov 26 15:14:33 crc kubenswrapper[5200]: I1126 15:14:33.040311 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tcstf"] Nov 26 15:14:33 crc kubenswrapper[5200]: I1126 15:14:33.154844 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:14:33 crc kubenswrapper[5200]: I1126 15:14:33.154923 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:14:34 crc kubenswrapper[5200]: I1126 15:14:34.519154 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tcstf" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" containerID="cri-o://f0e3bec0f1630ef51a91e699ff620eee48c868341901491a0517f3727ce9349e" gracePeriod=2 Nov 26 15:14:35 crc kubenswrapper[5200]: I1126 15:14:35.536323 5200 generic.go:358] "Generic (PLEG): container finished" podID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerID="f0e3bec0f1630ef51a91e699ff620eee48c868341901491a0517f3727ce9349e" exitCode=0 Nov 26 15:14:35 crc kubenswrapper[5200]: I1126 15:14:35.536420 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerDied","Data":"f0e3bec0f1630ef51a91e699ff620eee48c868341901491a0517f3727ce9349e"} Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.150618 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcstf" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.253204 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-catalog-content\") pod \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.253554 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cpdg\" (UniqueName: \"kubernetes.io/projected/d3b5b239-875c-4a28-ba71-1e4494a31d2d-kube-api-access-4cpdg\") pod \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.253596 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-utilities\") pod \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\" (UID: \"d3b5b239-875c-4a28-ba71-1e4494a31d2d\") " Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.254654 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-utilities" (OuterVolumeSpecName: "utilities") pod "d3b5b239-875c-4a28-ba71-1e4494a31d2d" (UID: "d3b5b239-875c-4a28-ba71-1e4494a31d2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.255624 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.257719 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b5b239-875c-4a28-ba71-1e4494a31d2d-kube-api-access-4cpdg" (OuterVolumeSpecName: "kube-api-access-4cpdg") pod "d3b5b239-875c-4a28-ba71-1e4494a31d2d" (UID: "d3b5b239-875c-4a28-ba71-1e4494a31d2d"). InnerVolumeSpecName "kube-api-access-4cpdg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.297428 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3b5b239-875c-4a28-ba71-1e4494a31d2d" (UID: "d3b5b239-875c-4a28-ba71-1e4494a31d2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.357501 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b5b239-875c-4a28-ba71-1e4494a31d2d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.357536 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cpdg\" (UniqueName: \"kubernetes.io/projected/d3b5b239-875c-4a28-ba71-1e4494a31d2d-kube-api-access-4cpdg\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.549274 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"12dbd5cb-56ae-4804-bfbf-e23b17e370dc","Type":"ContainerStarted","Data":"9e5daa54f1f189efaec2c9b2f78ea0400f9295dc2bb7095f9864e5c10d89d1ff"} Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.552644 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tcstf" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.552647 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcstf" event={"ID":"d3b5b239-875c-4a28-ba71-1e4494a31d2d","Type":"ContainerDied","Data":"e28ebfe88a5ed4d6d368033856b57e8a7d590b38c53d695b959b543dddf7d8b5"} Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.552748 5200 scope.go:117] "RemoveContainer" containerID="f0e3bec0f1630ef51a91e699ff620eee48c868341901491a0517f3727ce9349e" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.555669 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0fed987-4381-4dae-b6d3-c0a5f5a5a605","Type":"ContainerStarted","Data":"4be0d8b8de0c6cb4992c1b31628f106d22ed3d2565db9a2fee22871123fbd145"} Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.578045 5200 scope.go:117] "RemoveContainer" containerID="3bed4b1d83bad7eacbb288c65180be71229d1f71b50b97c71f9c043a4fe46bc4" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.591565 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tcstf"] Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.599315 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tcstf"] Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.617308 5200 scope.go:117] "RemoveContainer" containerID="3e2c6974fa8a5f6c97fe0ad3b061bb6c58f79f4186bc2e3f89cf2c7b33622be9" Nov 26 15:14:36 crc kubenswrapper[5200]: I1126 15:14:36.980644 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" path="/var/lib/kubelet/pods/d3b5b239-875c-4a28-ba71-1e4494a31d2d/volumes" Nov 26 15:14:40 crc kubenswrapper[5200]: I1126 15:14:40.615141 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0fed987-4381-4dae-b6d3-c0a5f5a5a605","Type":"ContainerStarted","Data":"c9acb527a14dcde95d087710036f2d2e93c3d95ac771c1ed179c7f2d3c6be899"} Nov 26 15:14:40 crc kubenswrapper[5200]: I1126 15:14:40.618385 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"12dbd5cb-56ae-4804-bfbf-e23b17e370dc","Type":"ContainerStarted","Data":"c84cc3fc355e013b4c5449710ad44ee317db11cc513b5224f1fca398db1d8ea2"} Nov 26 15:14:40 crc kubenswrapper[5200]: I1126 15:14:40.618703 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:40 crc kubenswrapper[5200]: I1126 15:14:40.622591 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Nov 26 15:14:40 crc kubenswrapper[5200]: I1126 15:14:40.644397 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.927291379 podStartE2EDuration="25.644382314s" podCreationTimestamp="2025-11-26 15:14:15 +0000 UTC" firstStartedPulling="2025-11-26 15:14:17.159422822 +0000 UTC m=+6225.838624640" lastFinishedPulling="2025-11-26 15:14:35.876513757 +0000 UTC m=+6244.555715575" observedRunningTime="2025-11-26 15:14:40.638075017 +0000 UTC m=+6249.317276835" watchObservedRunningTime="2025-11-26 15:14:40.644382314 +0000 UTC m=+6249.323584132" Nov 26 15:14:42 crc kubenswrapper[5200]: E1126 15:14:42.594534 5200 prober.go:256] "Unable 
to write all bytes from execInContainer" err="short write" expectedBytes=2314141 actualBytes=10240 Nov 26 15:14:44 crc kubenswrapper[5200]: I1126 15:14:44.660359 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0fed987-4381-4dae-b6d3-c0a5f5a5a605","Type":"ContainerStarted","Data":"b244b9de94df0c1d18a8a8484c07e1d1f798855e9da2365fbc6427afead78bfb"} Nov 26 15:14:44 crc kubenswrapper[5200]: I1126 15:14:44.691049 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.891118937 podStartE2EDuration="30.691029106s" podCreationTimestamp="2025-11-26 15:14:14 +0000 UTC" firstStartedPulling="2025-11-26 15:14:17.193635591 +0000 UTC m=+6225.872837399" lastFinishedPulling="2025-11-26 15:14:43.99354575 +0000 UTC m=+6252.672747568" observedRunningTime="2025-11-26 15:14:44.690520262 +0000 UTC m=+6253.369722180" watchObservedRunningTime="2025-11-26 15:14:44.691029106 +0000 UTC m=+6253.370230924" Nov 26 15:14:46 crc kubenswrapper[5200]: I1126 15:14:46.167288 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:46 crc kubenswrapper[5200]: I1126 15:14:46.167812 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:46 crc kubenswrapper[5200]: I1126 15:14:46.170360 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:46 crc kubenswrapper[5200]: I1126 15:14:46.681935 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.341130 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343016 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="extract-content" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343039 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="extract-content" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343077 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="registry-server" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343082 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="registry-server" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343094 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343100 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343111 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="extract-utilities" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343118 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" 
containerName="extract-utilities" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343131 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="extract-content" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343136 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="extract-content" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343159 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="extract-utilities" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343165 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="extract-utilities" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343344 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3b5b239-875c-4a28-ba71-1e4494a31d2d" containerName="registry-server" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.343356 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3afe83c-83ce-46ee-9b6a-4c1bee33d130" containerName="registry-server" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.351201 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.354635 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.356918 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.361410 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.469441 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-scripts\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.469508 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgd2h\" (UniqueName: \"kubernetes.io/projected/bb7fb9a7-0e10-4115-a440-3b320131a859-kube-api-access-dgd2h\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.469819 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.469941 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-config-data\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.470110 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-log-httpd\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.470143 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.470226 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-run-httpd\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.572687 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-run-httpd\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.572757 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-scripts\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.572816 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgd2h\" (UniqueName: \"kubernetes.io/projected/bb7fb9a7-0e10-4115-a440-3b320131a859-kube-api-access-dgd2h\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.572939 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.572992 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-config-data\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.573126 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-run-httpd\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.573229 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-log-httpd\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.573312 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.573599 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-log-httpd\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.580130 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-scripts\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.582952 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-config-data\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.588754 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.589177 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.592418 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgd2h\" (UniqueName: \"kubernetes.io/projected/bb7fb9a7-0e10-4115-a440-3b320131a859-kube-api-access-dgd2h\") pod \"ceilometer-0\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " pod="openstack/ceilometer-0" Nov 26 15:14:49 crc kubenswrapper[5200]: I1126 15:14:49.685120 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:14:50 crc kubenswrapper[5200]: I1126 15:14:50.190311 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:14:50 crc kubenswrapper[5200]: W1126 15:14:50.197485 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb7fb9a7_0e10_4115_a440_3b320131a859.slice/crio-cdd105fc5eb7447bc660f36d158e1a92e179765bd3f50812fd651fbef764bdd3 WatchSource:0}: Error finding container cdd105fc5eb7447bc660f36d158e1a92e179765bd3f50812fd651fbef764bdd3: Status 404 returned error can't find the container with id cdd105fc5eb7447bc660f36d158e1a92e179765bd3f50812fd651fbef764bdd3 Nov 26 15:14:50 crc kubenswrapper[5200]: I1126 15:14:50.739929 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerStarted","Data":"cdd105fc5eb7447bc660f36d158e1a92e179765bd3f50812fd651fbef764bdd3"} Nov 26 15:14:51 crc kubenswrapper[5200]: I1126 15:14:51.750058 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerStarted","Data":"34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192"} Nov 26 15:14:52 crc kubenswrapper[5200]: I1126 15:14:52.763144 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerStarted","Data":"27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c"} Nov 26 15:14:52 crc kubenswrapper[5200]: I1126 15:14:52.763516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerStarted","Data":"bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2"} Nov 26 15:14:54 crc kubenswrapper[5200]: I1126 15:14:54.789573 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerStarted","Data":"b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649"} Nov 26 15:14:54 crc kubenswrapper[5200]: I1126 15:14:54.792852 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 15:14:54 crc kubenswrapper[5200]: I1126 15:14:54.823304 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.07301356 podStartE2EDuration="5.823281845s" podCreationTimestamp="2025-11-26 15:14:49 +0000 UTC" firstStartedPulling="2025-11-26 15:14:50.199364933 +0000 UTC m=+6258.878566751" lastFinishedPulling="2025-11-26 15:14:53.949633208 +0000 UTC m=+6262.628835036" observedRunningTime="2025-11-26 15:14:54.814454238 +0000 UTC m=+6263.493656066" watchObservedRunningTime="2025-11-26 15:14:54.823281845 +0000 UTC m=+6263.502483663" Nov 26 15:14:59 crc kubenswrapper[5200]: I1126 15:14:59.048168 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sdpqn"] Nov 26 15:14:59 crc kubenswrapper[5200]: I1126 15:14:59.061735 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sdpqn"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.035007 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-mtvqr"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.076041 5200 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mtvqr"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.076209 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.111103 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ecf3-account-create-update-t9f9q"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.132584 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ecf3-account-create-update-t9f9q"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.161844 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.213958 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.214093 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.217596 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.218065 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.222840 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ff29d-e93e-457c-a768-cdce21f92e33-operator-scripts\") pod \"aodh-db-create-mtvqr\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.222908 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr8qp\" (UniqueName: \"kubernetes.io/projected/751ff29d-e93e-457c-a768-cdce21f92e33-kube-api-access-hr8qp\") pod \"aodh-db-create-mtvqr\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.237557 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/aodh-62f1-account-create-update-ncmvr"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.266087 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-62f1-account-create-update-ncmvr"] Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.266309 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.271515 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"aodh-db-secret\"" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326142 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8h8k\" (UniqueName: \"kubernetes.io/projected/042eba1e-d0a8-4058-ac19-3ead5f19889a-kube-api-access-z8h8k\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326234 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-operator-scripts\") pod \"aodh-62f1-account-create-update-ncmvr\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326275 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ff29d-e93e-457c-a768-cdce21f92e33-operator-scripts\") pod \"aodh-db-create-mtvqr\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326327 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr8qp\" (UniqueName: \"kubernetes.io/projected/751ff29d-e93e-457c-a768-cdce21f92e33-kube-api-access-hr8qp\") pod \"aodh-db-create-mtvqr\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326361 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/042eba1e-d0a8-4058-ac19-3ead5f19889a-config-volume\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326427 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqtks\" (UniqueName: \"kubernetes.io/projected/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-kube-api-access-qqtks\") pod \"aodh-62f1-account-create-update-ncmvr\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.326464 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/042eba1e-d0a8-4058-ac19-3ead5f19889a-secret-volume\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.328075 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ff29d-e93e-457c-a768-cdce21f92e33-operator-scripts\") pod \"aodh-db-create-mtvqr\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " 
pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.349484 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr8qp\" (UniqueName: \"kubernetes.io/projected/751ff29d-e93e-457c-a768-cdce21f92e33-kube-api-access-hr8qp\") pod \"aodh-db-create-mtvqr\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.429296 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/042eba1e-d0a8-4058-ac19-3ead5f19889a-config-volume\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.429400 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqtks\" (UniqueName: \"kubernetes.io/projected/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-kube-api-access-qqtks\") pod \"aodh-62f1-account-create-update-ncmvr\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.429438 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/042eba1e-d0a8-4058-ac19-3ead5f19889a-secret-volume\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.429521 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8h8k\" (UniqueName: \"kubernetes.io/projected/042eba1e-d0a8-4058-ac19-3ead5f19889a-kube-api-access-z8h8k\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.429591 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-operator-scripts\") pod \"aodh-62f1-account-create-update-ncmvr\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.430428 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/042eba1e-d0a8-4058-ac19-3ead5f19889a-config-volume\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.430470 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-operator-scripts\") pod \"aodh-62f1-account-create-update-ncmvr\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.432829 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.433373 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/042eba1e-d0a8-4058-ac19-3ead5f19889a-secret-volume\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.455968 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqtks\" (UniqueName: \"kubernetes.io/projected/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-kube-api-access-qqtks\") pod \"aodh-62f1-account-create-update-ncmvr\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.460403 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8h8k\" (UniqueName: \"kubernetes.io/projected/042eba1e-d0a8-4058-ac19-3ead5f19889a-kube-api-access-z8h8k\") pod \"collect-profiles-29402835-g954k\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.553163 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.591258 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.986905 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9757e8-8ae6-4526-bb80-38e4decb481b" path="/var/lib/kubelet/pods/ba9757e8-8ae6-4526-bb80-38e4decb481b/volumes" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.988276 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db638b8d-4a3d-4c79-a7e1-70be9299bd2e" path="/var/lib/kubelet/pods/db638b8d-4a3d-4c79-a7e1-70be9299bd2e/volumes" Nov 26 15:15:00 crc kubenswrapper[5200]: I1126 15:15:00.989282 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mtvqr"] Nov 26 15:15:00 crc kubenswrapper[5200]: W1126 15:15:00.994820 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod751ff29d_e93e_457c_a768_cdce21f92e33.slice/crio-c1078f80e1a0962ecb8def76d83edcca55db38b0077bb4bbcaf50d12879e9e44 WatchSource:0}: Error finding container c1078f80e1a0962ecb8def76d83edcca55db38b0077bb4bbcaf50d12879e9e44: Status 404 returned error can't find the container with id c1078f80e1a0962ecb8def76d83edcca55db38b0077bb4bbcaf50d12879e9e44 Nov 26 15:15:01 crc kubenswrapper[5200]: W1126 15:15:01.228243 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c00c0c_d2c6_48d4_a6a5_3bbc89c7c552.slice/crio-d54828a15f470b1b556d17261abf55be64c2ff1a2468bf8664f8c92b658e906e WatchSource:0}: Error finding container d54828a15f470b1b556d17261abf55be64c2ff1a2468bf8664f8c92b658e906e: Status 404 returned error can't find the container with id d54828a15f470b1b556d17261abf55be64c2ff1a2468bf8664f8c92b658e906e Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.233539 5200 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openstack/aodh-62f1-account-create-update-ncmvr"] Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.234320 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"aodh-db-secret\"" Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.333333 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k"] Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.865376 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" event={"ID":"042eba1e-d0a8-4058-ac19-3ead5f19889a","Type":"ContainerStarted","Data":"2da214eb36e3fe4a2cca0797ccdb3f8078f1a9af42f0aff0c1c54d64c068e49d"} Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.865470 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" event={"ID":"042eba1e-d0a8-4058-ac19-3ead5f19889a","Type":"ContainerStarted","Data":"161784b0dbce04520d93233b5c0ba936ca5f0e84504834ab791715e1099d1482"} Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.867905 5200 generic.go:358] "Generic (PLEG): container finished" podID="751ff29d-e93e-457c-a768-cdce21f92e33" containerID="405e3e4dcf142cc3b989abdaf6f9edccb02a96f539f6fd49db0543441246a39f" exitCode=0 Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.867993 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mtvqr" event={"ID":"751ff29d-e93e-457c-a768-cdce21f92e33","Type":"ContainerDied","Data":"405e3e4dcf142cc3b989abdaf6f9edccb02a96f539f6fd49db0543441246a39f"} Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.868043 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mtvqr" event={"ID":"751ff29d-e93e-457c-a768-cdce21f92e33","Type":"ContainerStarted","Data":"c1078f80e1a0962ecb8def76d83edcca55db38b0077bb4bbcaf50d12879e9e44"} Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.869914 5200 generic.go:358] "Generic (PLEG): container finished" podID="29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" containerID="8dcfe268fec275ba9af34029bc57e899701a6f48aadebd974af72142a35e637e" exitCode=0 Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.870069 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-62f1-account-create-update-ncmvr" event={"ID":"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552","Type":"ContainerDied","Data":"8dcfe268fec275ba9af34029bc57e899701a6f48aadebd974af72142a35e637e"} Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.870179 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-62f1-account-create-update-ncmvr" event={"ID":"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552","Type":"ContainerStarted","Data":"d54828a15f470b1b556d17261abf55be64c2ff1a2468bf8664f8c92b658e906e"} Nov 26 15:15:01 crc kubenswrapper[5200]: I1126 15:15:01.903390 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" podStartSLOduration=1.903370869 podStartE2EDuration="1.903370869s" podCreationTimestamp="2025-11-26 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:15:01.880720731 +0000 UTC m=+6270.559922549" watchObservedRunningTime="2025-11-26 15:15:01.903370869 +0000 UTC m=+6270.582572687" Nov 26 15:15:02 crc kubenswrapper[5200]: I1126 
15:15:02.881106 5200 generic.go:358] "Generic (PLEG): container finished" podID="042eba1e-d0a8-4058-ac19-3ead5f19889a" containerID="2da214eb36e3fe4a2cca0797ccdb3f8078f1a9af42f0aff0c1c54d64c068e49d" exitCode=0 Nov 26 15:15:02 crc kubenswrapper[5200]: I1126 15:15:02.881365 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" event={"ID":"042eba1e-d0a8-4058-ac19-3ead5f19889a","Type":"ContainerDied","Data":"2da214eb36e3fe4a2cca0797ccdb3f8078f1a9af42f0aff0c1c54d64c068e49d"} Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.154818 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.154901 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.154961 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.155936 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1682d578e628d48f9c00a7583ad02f38b321739db47ca670781a2baf1284d737"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.156017 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://1682d578e628d48f9c00a7583ad02f38b321739db47ca670781a2baf1284d737" gracePeriod=600 Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.463705 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.489583 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.524435 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqtks\" (UniqueName: \"kubernetes.io/projected/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-kube-api-access-qqtks\") pod \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.524810 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr8qp\" (UniqueName: \"kubernetes.io/projected/751ff29d-e93e-457c-a768-cdce21f92e33-kube-api-access-hr8qp\") pod \"751ff29d-e93e-457c-a768-cdce21f92e33\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.524893 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-operator-scripts\") pod \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\" (UID: \"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552\") " Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.524945 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ff29d-e93e-457c-a768-cdce21f92e33-operator-scripts\") pod \"751ff29d-e93e-457c-a768-cdce21f92e33\" (UID: \"751ff29d-e93e-457c-a768-cdce21f92e33\") " Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.526227 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/751ff29d-e93e-457c-a768-cdce21f92e33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "751ff29d-e93e-457c-a768-cdce21f92e33" (UID: "751ff29d-e93e-457c-a768-cdce21f92e33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.532613 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" (UID: "29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.538509 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751ff29d-e93e-457c-a768-cdce21f92e33-kube-api-access-hr8qp" (OuterVolumeSpecName: "kube-api-access-hr8qp") pod "751ff29d-e93e-457c-a768-cdce21f92e33" (UID: "751ff29d-e93e-457c-a768-cdce21f92e33"). InnerVolumeSpecName "kube-api-access-hr8qp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.540087 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-kube-api-access-qqtks" (OuterVolumeSpecName: "kube-api-access-qqtks") pod "29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" (UID: "29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552"). InnerVolumeSpecName "kube-api-access-qqtks". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.627613 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hr8qp\" (UniqueName: \"kubernetes.io/projected/751ff29d-e93e-457c-a768-cdce21f92e33-kube-api-access-hr8qp\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.628137 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.628153 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751ff29d-e93e-457c-a768-cdce21f92e33-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.628168 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqtks\" (UniqueName: \"kubernetes.io/projected/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552-kube-api-access-qqtks\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.898440 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mtvqr" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.898483 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mtvqr" event={"ID":"751ff29d-e93e-457c-a768-cdce21f92e33","Type":"ContainerDied","Data":"c1078f80e1a0962ecb8def76d83edcca55db38b0077bb4bbcaf50d12879e9e44"} Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.898528 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1078f80e1a0962ecb8def76d83edcca55db38b0077bb4bbcaf50d12879e9e44" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.900580 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-62f1-account-create-update-ncmvr" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.900579 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-62f1-account-create-update-ncmvr" event={"ID":"29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552","Type":"ContainerDied","Data":"d54828a15f470b1b556d17261abf55be64c2ff1a2468bf8664f8c92b658e906e"} Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.900780 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54828a15f470b1b556d17261abf55be64c2ff1a2468bf8664f8c92b658e906e" Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.902866 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="1682d578e628d48f9c00a7583ad02f38b321739db47ca670781a2baf1284d737" exitCode=0 Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.903311 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"1682d578e628d48f9c00a7583ad02f38b321739db47ca670781a2baf1284d737"} Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.903342 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525"} Nov 26 15:15:03 crc kubenswrapper[5200]: I1126 15:15:03.903363 5200 scope.go:117] "RemoveContainer" containerID="1649fd56f5038dcc5875876b0c01f82b23ea053e2ebd1bbeb50434d8babde0dd" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.337808 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.449208 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/042eba1e-d0a8-4058-ac19-3ead5f19889a-config-volume\") pod \"042eba1e-d0a8-4058-ac19-3ead5f19889a\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.449454 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/042eba1e-d0a8-4058-ac19-3ead5f19889a-secret-volume\") pod \"042eba1e-d0a8-4058-ac19-3ead5f19889a\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.449481 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8h8k\" (UniqueName: \"kubernetes.io/projected/042eba1e-d0a8-4058-ac19-3ead5f19889a-kube-api-access-z8h8k\") pod \"042eba1e-d0a8-4058-ac19-3ead5f19889a\" (UID: \"042eba1e-d0a8-4058-ac19-3ead5f19889a\") " Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.452258 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042eba1e-d0a8-4058-ac19-3ead5f19889a-config-volume" (OuterVolumeSpecName: "config-volume") pod "042eba1e-d0a8-4058-ac19-3ead5f19889a" (UID: "042eba1e-d0a8-4058-ac19-3ead5f19889a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.456940 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042eba1e-d0a8-4058-ac19-3ead5f19889a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "042eba1e-d0a8-4058-ac19-3ead5f19889a" (UID: "042eba1e-d0a8-4058-ac19-3ead5f19889a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.461918 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042eba1e-d0a8-4058-ac19-3ead5f19889a-kube-api-access-z8h8k" (OuterVolumeSpecName: "kube-api-access-z8h8k") pod "042eba1e-d0a8-4058-ac19-3ead5f19889a" (UID: "042eba1e-d0a8-4058-ac19-3ead5f19889a"). InnerVolumeSpecName "kube-api-access-z8h8k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.552391 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/042eba1e-d0a8-4058-ac19-3ead5f19889a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.552736 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/042eba1e-d0a8-4058-ac19-3ead5f19889a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.552837 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8h8k\" (UniqueName: \"kubernetes.io/projected/042eba1e-d0a8-4058-ac19-3ead5f19889a-kube-api-access-z8h8k\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.919885 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.919902 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k" event={"ID":"042eba1e-d0a8-4058-ac19-3ead5f19889a","Type":"ContainerDied","Data":"161784b0dbce04520d93233b5c0ba936ca5f0e84504834ab791715e1099d1482"} Nov 26 15:15:04 crc kubenswrapper[5200]: I1126 15:15:04.919944 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="161784b0dbce04520d93233b5c0ba936ca5f0e84504834ab791715e1099d1482" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.436141 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46"] Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.451215 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402790-r7h46"] Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.465219 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-fxp72"] Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.477806 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="042eba1e-d0a8-4058-ac19-3ead5f19889a" containerName="collect-profiles" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.477844 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="042eba1e-d0a8-4058-ac19-3ead5f19889a" containerName="collect-profiles" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.477862 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="751ff29d-e93e-457c-a768-cdce21f92e33" containerName="mariadb-database-create" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.477868 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="751ff29d-e93e-457c-a768-cdce21f92e33" containerName="mariadb-database-create" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.477897 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" containerName="mariadb-account-create-update" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.477906 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" containerName="mariadb-account-create-update" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.478122 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="042eba1e-d0a8-4058-ac19-3ead5f19889a" containerName="collect-profiles" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.478145 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" containerName="mariadb-account-create-update" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.478160 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="751ff29d-e93e-457c-a768-cdce21f92e33" containerName="mariadb-database-create" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.488337 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fxp72"] Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.488614 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.490866 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"aodh-config-data\"" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.491642 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.491660 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"aodh-scripts\"" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.499504 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"telemetry-autoscaling-dockercfg-tprdm\"" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.580686 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-scripts\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.580815 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-combined-ca-bundle\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.581188 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bd8\" (UniqueName: \"kubernetes.io/projected/dcc8058e-4b34-41fd-893f-a35dd98059fd-kube-api-access-l4bd8\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.581302 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-config-data\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.683614 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bd8\" (UniqueName: \"kubernetes.io/projected/dcc8058e-4b34-41fd-893f-a35dd98059fd-kube-api-access-l4bd8\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.684099 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-config-data\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.684261 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-scripts\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.684358 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-combined-ca-bundle\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.691919 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-scripts\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.702612 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-combined-ca-bundle\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.742968 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bd8\" (UniqueName: \"kubernetes.io/projected/dcc8058e-4b34-41fd-893f-a35dd98059fd-kube-api-access-l4bd8\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.755995 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-config-data\") pod \"aodh-db-sync-fxp72\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:05 crc kubenswrapper[5200]: I1126 15:15:05.810167 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:06 crc kubenswrapper[5200]: I1126 15:15:06.390369 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fxp72"] Nov 26 15:15:06 crc kubenswrapper[5200]: I1126 15:15:06.939802 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxp72" event={"ID":"dcc8058e-4b34-41fd-893f-a35dd98059fd","Type":"ContainerStarted","Data":"8a772d20b317835e7e9140c30a42b30606daf5cef2ae6ee6b5d2835cce22557d"} Nov 26 15:15:06 crc kubenswrapper[5200]: I1126 15:15:06.984438 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165ddac3-3c99-4f74-abd2-4fe5be08ab73" path="/var/lib/kubelet/pods/165ddac3-3c99-4f74-abd2-4fe5be08ab73/volumes" Nov 26 15:15:09 crc kubenswrapper[5200]: I1126 15:15:09.044176 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-69jk8"] Nov 26 15:15:09 crc kubenswrapper[5200]: I1126 15:15:09.058708 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-69jk8"] Nov 26 15:15:11 crc kubenswrapper[5200]: I1126 15:15:11.020361 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70e5dae-5910-47ef-83b5-5cfcf024a27e" path="/var/lib/kubelet/pods/c70e5dae-5910-47ef-83b5-5cfcf024a27e/volumes" Nov 26 15:15:12 crc kubenswrapper[5200]: I1126 15:15:12.020908 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxp72" event={"ID":"dcc8058e-4b34-41fd-893f-a35dd98059fd","Type":"ContainerStarted","Data":"254e39fb52b94a5955381f88436be52afdd8f843264fd6c308ce8cadec8f19f6"} Nov 26 15:15:12 crc kubenswrapper[5200]: I1126 15:15:12.045979 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-fxp72" podStartSLOduration=2.672794562 podStartE2EDuration="7.045954593s" podCreationTimestamp="2025-11-26 15:15:05 +0000 UTC" firstStartedPulling="2025-11-26 15:15:06.405268608 +0000 UTC m=+6275.084470426" lastFinishedPulling="2025-11-26 15:15:10.778428639 +0000 UTC m=+6279.457630457" observedRunningTime="2025-11-26 15:15:12.036748146 +0000 UTC m=+6280.715949964" watchObservedRunningTime="2025-11-26 15:15:12.045954593 +0000 UTC m=+6280.725156411" Nov 26 15:15:14 crc kubenswrapper[5200]: I1126 15:15:14.048431 5200 generic.go:358] "Generic (PLEG): container finished" podID="dcc8058e-4b34-41fd-893f-a35dd98059fd" containerID="254e39fb52b94a5955381f88436be52afdd8f843264fd6c308ce8cadec8f19f6" exitCode=0 Nov 26 15:15:14 crc kubenswrapper[5200]: I1126 15:15:14.048546 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxp72" event={"ID":"dcc8058e-4b34-41fd-893f-a35dd98059fd","Type":"ContainerDied","Data":"254e39fb52b94a5955381f88436be52afdd8f843264fd6c308ce8cadec8f19f6"} Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.451342 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.557859 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-config-data\") pod \"dcc8058e-4b34-41fd-893f-a35dd98059fd\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.557918 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-scripts\") pod \"dcc8058e-4b34-41fd-893f-a35dd98059fd\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.557986 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bd8\" (UniqueName: \"kubernetes.io/projected/dcc8058e-4b34-41fd-893f-a35dd98059fd-kube-api-access-l4bd8\") pod \"dcc8058e-4b34-41fd-893f-a35dd98059fd\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.558033 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-combined-ca-bundle\") pod \"dcc8058e-4b34-41fd-893f-a35dd98059fd\" (UID: \"dcc8058e-4b34-41fd-893f-a35dd98059fd\") " Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.563874 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-scripts" (OuterVolumeSpecName: "scripts") pod "dcc8058e-4b34-41fd-893f-a35dd98059fd" (UID: "dcc8058e-4b34-41fd-893f-a35dd98059fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.564202 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc8058e-4b34-41fd-893f-a35dd98059fd-kube-api-access-l4bd8" (OuterVolumeSpecName: "kube-api-access-l4bd8") pod "dcc8058e-4b34-41fd-893f-a35dd98059fd" (UID: "dcc8058e-4b34-41fd-893f-a35dd98059fd"). InnerVolumeSpecName "kube-api-access-l4bd8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.591196 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-config-data" (OuterVolumeSpecName: "config-data") pod "dcc8058e-4b34-41fd-893f-a35dd98059fd" (UID: "dcc8058e-4b34-41fd-893f-a35dd98059fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.595224 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcc8058e-4b34-41fd-893f-a35dd98059fd" (UID: "dcc8058e-4b34-41fd-893f-a35dd98059fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.677412 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4bd8\" (UniqueName: \"kubernetes.io/projected/dcc8058e-4b34-41fd-893f-a35dd98059fd-kube-api-access-l4bd8\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.677769 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.677783 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[5200]: I1126 15:15:15.677797 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc8058e-4b34-41fd-893f-a35dd98059fd-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:16 crc kubenswrapper[5200]: I1126 15:15:16.070666 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-fxp72" Nov 26 15:15:16 crc kubenswrapper[5200]: I1126 15:15:16.070675 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxp72" event={"ID":"dcc8058e-4b34-41fd-893f-a35dd98059fd","Type":"ContainerDied","Data":"8a772d20b317835e7e9140c30a42b30606daf5cef2ae6ee6b5d2835cce22557d"} Nov 26 15:15:16 crc kubenswrapper[5200]: I1126 15:15:16.070765 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a772d20b317835e7e9140c30a42b30606daf5cef2ae6ee6b5d2835cce22557d" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.076797 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.078638 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcc8058e-4b34-41fd-893f-a35dd98059fd" containerName="aodh-db-sync" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.078665 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc8058e-4b34-41fd-893f-a35dd98059fd" containerName="aodh-db-sync" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.079001 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcc8058e-4b34-41fd-893f-a35dd98059fd" containerName="aodh-db-sync" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.086128 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.089188 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"telemetry-autoscaling-dockercfg-tprdm\"" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.089451 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"aodh-scripts\"" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.089802 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"aodh-config-data\"" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.105379 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.186226 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-config-data\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.186600 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-combined-ca-bundle\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.186648 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-scripts\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.186760 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2spp\" (UniqueName: \"kubernetes.io/projected/826dcf1f-7136-4b7f-b5c0-66788d96f2df-kube-api-access-l2spp\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.288358 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-config-data\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.288419 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-combined-ca-bundle\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.288453 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-scripts\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.288527 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2spp\" (UniqueName: \"kubernetes.io/projected/826dcf1f-7136-4b7f-b5c0-66788d96f2df-kube-api-access-l2spp\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " 
pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.298170 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-scripts\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.298722 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-combined-ca-bundle\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.300062 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826dcf1f-7136-4b7f-b5c0-66788d96f2df-config-data\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.313378 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2spp\" (UniqueName: \"kubernetes.io/projected/826dcf1f-7136-4b7f-b5c0-66788d96f2df-kube-api-access-l2spp\") pod \"aodh-0\" (UID: \"826dcf1f-7136-4b7f-b5c0-66788d96f2df\") " pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.416645 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 26 15:15:20 crc kubenswrapper[5200]: I1126 15:15:20.904869 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 15:15:21.129437 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"826dcf1f-7136-4b7f-b5c0-66788d96f2df","Type":"ContainerStarted","Data":"c0582ed4d002f47293bf4c76bfdc22027ae92bba0402d8a4241ade977026633d"} Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 15:15:21.277710 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 15:15:21.278377 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-central-agent" containerID="cri-o://34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192" gracePeriod=30 Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 15:15:21.278429 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-notification-agent" containerID="cri-o://bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2" gracePeriod=30 Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 15:15:21.278440 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="sg-core" containerID="cri-o://27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c" gracePeriod=30 Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 15:15:21.278450 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="proxy-httpd" containerID="cri-o://b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649" gracePeriod=30 Nov 26 15:15:21 crc kubenswrapper[5200]: I1126 
15:15:21.284199 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.134:3000/\": EOF" Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.141678 5200 generic.go:358] "Generic (PLEG): container finished" podID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerID="b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649" exitCode=0 Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.142117 5200 generic.go:358] "Generic (PLEG): container finished" podID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerID="27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c" exitCode=2 Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.142130 5200 generic.go:358] "Generic (PLEG): container finished" podID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerID="34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192" exitCode=0 Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.141713 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerDied","Data":"b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649"} Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.142175 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerDied","Data":"27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c"} Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.142187 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerDied","Data":"34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192"} Nov 26 15:15:22 crc kubenswrapper[5200]: I1126 15:15:22.146050 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"826dcf1f-7136-4b7f-b5c0-66788d96f2df","Type":"ContainerStarted","Data":"ad1937bf9aeda209bd9dbc40124278592cd62d1ec79b8f1282b902a3bd635ef5"} Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.181114 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"826dcf1f-7136-4b7f-b5c0-66788d96f2df","Type":"ContainerStarted","Data":"600f2985d10970edabf1610005493666e492c1c08093dd0a2fdf26d699ad1e4d"} Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.681706 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.821969 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-scripts\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.822019 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-sg-core-conf-yaml\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.822086 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-log-httpd\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.822115 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-run-httpd\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.822184 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-config-data\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.822415 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-combined-ca-bundle\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.823093 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgd2h\" (UniqueName: \"kubernetes.io/projected/bb7fb9a7-0e10-4115-a440-3b320131a859-kube-api-access-dgd2h\") pod \"bb7fb9a7-0e10-4115-a440-3b320131a859\" (UID: \"bb7fb9a7-0e10-4115-a440-3b320131a859\") " Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.826226 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.826931 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.831675 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7fb9a7-0e10-4115-a440-3b320131a859-kube-api-access-dgd2h" (OuterVolumeSpecName: "kube-api-access-dgd2h") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "kube-api-access-dgd2h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.831988 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-scripts" (OuterVolumeSpecName: "scripts") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.874032 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.916918 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.929094 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.929258 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.929339 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.929404 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7fb9a7-0e10-4115-a440-3b320131a859-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.929481 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.929563 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dgd2h\" (UniqueName: \"kubernetes.io/projected/bb7fb9a7-0e10-4115-a440-3b320131a859-kube-api-access-dgd2h\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:24 crc kubenswrapper[5200]: I1126 15:15:24.941546 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-config-data" (OuterVolumeSpecName: "config-data") pod "bb7fb9a7-0e10-4115-a440-3b320131a859" (UID: "bb7fb9a7-0e10-4115-a440-3b320131a859"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.032159 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7fb9a7-0e10-4115-a440-3b320131a859-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.195351 5200 generic.go:358] "Generic (PLEG): container finished" podID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerID="bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2" exitCode=0 Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.195713 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerDied","Data":"bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2"} Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.195743 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7fb9a7-0e10-4115-a440-3b320131a859","Type":"ContainerDied","Data":"cdd105fc5eb7447bc660f36d158e1a92e179765bd3f50812fd651fbef764bdd3"} Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.195760 5200 scope.go:117] "RemoveContainer" containerID="b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.195991 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.246335 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.267586 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.277911 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280206 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-central-agent" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280244 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-central-agent" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280293 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="sg-core" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280304 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="sg-core" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280326 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="proxy-httpd" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280338 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="proxy-httpd" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280353 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-notification-agent" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280361 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-notification-agent" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280729 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-central-agent" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280767 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="ceilometer-notification-agent" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280804 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="proxy-httpd" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.280816 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" containerName="sg-core" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.291039 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.291215 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.294996 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.295175 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.431960 5200 scope.go:117] "RemoveContainer" containerID="27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442362 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442499 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-log-httpd\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442582 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-scripts\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442670 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-config-data\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442721 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-run-httpd\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442810 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.442848 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xhc\" (UniqueName: \"kubernetes.io/projected/889ff754-6fa4-4704-873f-f5606e3d8457-kube-api-access-f8xhc\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.491381 5200 scope.go:117] "RemoveContainer" containerID="bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.527241 5200 scope.go:117] "RemoveContainer" containerID="34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.544735 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-scripts\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.545090 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-config-data\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.545139 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-run-httpd\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.545225 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.545255 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xhc\" (UniqueName: \"kubernetes.io/projected/889ff754-6fa4-4704-873f-f5606e3d8457-kube-api-access-f8xhc\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.545331 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc 
kubenswrapper[5200]: I1126 15:15:25.545421 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-log-httpd\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.545870 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-run-httpd\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.546927 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-log-httpd\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.550735 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.551901 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.551901 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-config-data\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.551969 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-scripts\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.562061 5200 scope.go:117] "RemoveContainer" containerID="b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649" Nov 26 15:15:25 crc kubenswrapper[5200]: E1126 15:15:25.562437 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649\": container with ID starting with b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649 not found: ID does not exist" containerID="b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.562468 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649"} err="failed to get container status \"b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649\": rpc error: code = NotFound desc = could not find container \"b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649\": container with ID starting with 
b68e52c1faf9f01d138cba4d59b53c3014dec5298aa44831734d27a2304d2649 not found: ID does not exist" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.562490 5200 scope.go:117] "RemoveContainer" containerID="27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c" Nov 26 15:15:25 crc kubenswrapper[5200]: E1126 15:15:25.562833 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c\": container with ID starting with 27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c not found: ID does not exist" containerID="27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.562854 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c"} err="failed to get container status \"27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c\": rpc error: code = NotFound desc = could not find container \"27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c\": container with ID starting with 27097e7b865701662ea20a368958df69be031572ec04670331c937ee542e294c not found: ID does not exist" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.562866 5200 scope.go:117] "RemoveContainer" containerID="bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2" Nov 26 15:15:25 crc kubenswrapper[5200]: E1126 15:15:25.563443 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2\": container with ID starting with bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2 not found: ID does not exist" containerID="bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.563464 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2"} err="failed to get container status \"bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2\": rpc error: code = NotFound desc = could not find container \"bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2\": container with ID starting with bfa1ad93ccbdac186459dff5c725710127ca968850e18d6e32eb5a5a9c72d9f2 not found: ID does not exist" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.563477 5200 scope.go:117] "RemoveContainer" containerID="34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192" Nov 26 15:15:25 crc kubenswrapper[5200]: E1126 15:15:25.563728 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192\": container with ID starting with 34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192 not found: ID does not exist" containerID="34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.563749 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192"} err="failed to get container status \"34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192\": rpc 
error: code = NotFound desc = could not find container \"34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192\": container with ID starting with 34d5fe6567bbf965c84903b782c68c79cbe2cf47c6120a4b905e58ef01dc3192 not found: ID does not exist" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.565287 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xhc\" (UniqueName: \"kubernetes.io/projected/889ff754-6fa4-4704-873f-f5606e3d8457-kube-api-access-f8xhc\") pod \"ceilometer-0\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " pod="openstack/ceilometer-0" Nov 26 15:15:25 crc kubenswrapper[5200]: I1126 15:15:25.633160 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:15:26 crc kubenswrapper[5200]: I1126 15:15:26.210664 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"826dcf1f-7136-4b7f-b5c0-66788d96f2df","Type":"ContainerStarted","Data":"73b71083e8e7658f156ee39e87ed39b80af0fbbf2905228ce2f4e31ef61a8d1a"} Nov 26 15:15:26 crc kubenswrapper[5200]: I1126 15:15:26.233986 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:26 crc kubenswrapper[5200]: I1126 15:15:26.980346 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7fb9a7-0e10-4115-a440-3b320131a859" path="/var/lib/kubelet/pods/bb7fb9a7-0e10-4115-a440-3b320131a859/volumes" Nov 26 15:15:27 crc kubenswrapper[5200]: I1126 15:15:27.221987 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerStarted","Data":"77a0b0a3db804e1d45ec385430a59e4cef2e16fcd8f514d872e257296be5c55e"} Nov 26 15:15:28 crc kubenswrapper[5200]: I1126 15:15:28.242534 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerStarted","Data":"b641993913e79c393cfb1d33499961aa10388c951b5013382479dd8008a87f72"} Nov 26 15:15:29 crc kubenswrapper[5200]: I1126 15:15:29.257392 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"826dcf1f-7136-4b7f-b5c0-66788d96f2df","Type":"ContainerStarted","Data":"dd93d2a75af11d6931b384c2eb2337ed23cac0eafdef40bfc465c293a68bf1d3"} Nov 26 15:15:29 crc kubenswrapper[5200]: I1126 15:15:29.289000 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.525105583 podStartE2EDuration="9.288967327s" podCreationTimestamp="2025-11-26 15:15:20 +0000 UTC" firstStartedPulling="2025-11-26 15:15:20.904876349 +0000 UTC m=+6289.584078167" lastFinishedPulling="2025-11-26 15:15:28.668738093 +0000 UTC m=+6297.347939911" observedRunningTime="2025-11-26 15:15:29.281881736 +0000 UTC m=+6297.961083554" watchObservedRunningTime="2025-11-26 15:15:29.288967327 +0000 UTC m=+6297.968169145" Nov 26 15:15:30 crc kubenswrapper[5200]: I1126 15:15:30.278557 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerStarted","Data":"cd79668a5757e2d24c79c3140cdab3b54b4e81d6aebdf618d24db3f6d30cf8ba"} Nov 26 15:15:30 crc kubenswrapper[5200]: I1126 15:15:30.279605 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerStarted","Data":"e3eed3351dd891170c5166e0d17162938684d0206cd43423dbfee3378d59cf9b"} Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.297497 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerStarted","Data":"5d171398d3a557f4a4431ba2ad40fcb8024e812b94e45adc4e81ab8ddb1445c0"} Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.298313 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.326231 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.170621088 podStartE2EDuration="7.326177377s" podCreationTimestamp="2025-11-26 15:15:25 +0000 UTC" firstStartedPulling="2025-11-26 15:15:26.248508139 +0000 UTC m=+6294.927709957" lastFinishedPulling="2025-11-26 15:15:31.404064428 +0000 UTC m=+6300.083266246" observedRunningTime="2025-11-26 15:15:32.314404311 +0000 UTC m=+6300.993606129" watchObservedRunningTime="2025-11-26 15:15:32.326177377 +0000 UTC m=+6301.005379225" Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.545625 5200 scope.go:117] "RemoveContainer" containerID="014be205dd250d19db454eb3e8bedfbea45f4f5af81598a86843b5a0fac5f517" Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.574466 5200 scope.go:117] "RemoveContainer" containerID="c4084af00de440b6712c59eb8eaf4a9a0c9391ad8255782c9f3b99df82884fed" Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.623054 5200 scope.go:117] "RemoveContainer" containerID="80ca65d6b46b71f1284b057212fefbee1b899841c85115c98cae1ed6dcd50b96" Nov 26 15:15:32 crc kubenswrapper[5200]: I1126 15:15:32.689732 5200 scope.go:117] "RemoveContainer" containerID="1b107f87efe6d2a973ca9408110e56c1cd2d7e3ff0c6b1477ea7fa46c7badb0a" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.684489 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-5nj7t"] Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.696468 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5nj7t"] Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.696632 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.757268 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/manila-83fd-account-create-update-h5bvf"] Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.764778 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.767039 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-db-secret\"" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.770210 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-83fd-account-create-update-h5bvf"] Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.807949 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxv5\" (UniqueName: \"kubernetes.io/projected/79ca5afe-27f4-40ac-abec-e35e45e94345-kube-api-access-nmxv5\") pod \"manila-db-create-5nj7t\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.808006 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca5afe-27f4-40ac-abec-e35e45e94345-operator-scripts\") pod \"manila-db-create-5nj7t\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.909671 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca5afe-27f4-40ac-abec-e35e45e94345-operator-scripts\") pod \"manila-db-create-5nj7t\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.909776 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmtg\" (UniqueName: \"kubernetes.io/projected/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-kube-api-access-ntmtg\") pod \"manila-83fd-account-create-update-h5bvf\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.910359 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-operator-scripts\") pod \"manila-83fd-account-create-update-h5bvf\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.910457 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca5afe-27f4-40ac-abec-e35e45e94345-operator-scripts\") pod \"manila-db-create-5nj7t\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.910756 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxv5\" (UniqueName: \"kubernetes.io/projected/79ca5afe-27f4-40ac-abec-e35e45e94345-kube-api-access-nmxv5\") pod \"manila-db-create-5nj7t\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:35 crc kubenswrapper[5200]: I1126 15:15:35.942419 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxv5\" (UniqueName: \"kubernetes.io/projected/79ca5afe-27f4-40ac-abec-e35e45e94345-kube-api-access-nmxv5\") pod 
\"manila-db-create-5nj7t\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.018882 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-operator-scripts\") pod \"manila-83fd-account-create-update-h5bvf\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.019094 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmtg\" (UniqueName: \"kubernetes.io/projected/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-kube-api-access-ntmtg\") pod \"manila-83fd-account-create-update-h5bvf\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.020287 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-operator-scripts\") pod \"manila-83fd-account-create-update-h5bvf\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.039500 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmtg\" (UniqueName: \"kubernetes.io/projected/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-kube-api-access-ntmtg\") pod \"manila-83fd-account-create-update-h5bvf\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.048668 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.081932 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.632651 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5nj7t"] Nov 26 15:15:36 crc kubenswrapper[5200]: I1126 15:15:36.735748 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-83fd-account-create-update-h5bvf"] Nov 26 15:15:36 crc kubenswrapper[5200]: W1126 15:15:36.742851 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87cc06a3_07fc_407c_91bf_0e5bfa8d2bcd.slice/crio-b6bfa008e6c3fe20e871776a77136a019c11e157f424968a9a50fc57a8c721dd WatchSource:0}: Error finding container b6bfa008e6c3fe20e871776a77136a019c11e157f424968a9a50fc57a8c721dd: Status 404 returned error can't find the container with id b6bfa008e6c3fe20e871776a77136a019c11e157f424968a9a50fc57a8c721dd Nov 26 15:15:37 crc kubenswrapper[5200]: E1126 15:15:37.302861 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79ca5afe_27f4_40ac_abec_e35e45e94345.slice/crio-conmon-00269e0f8dcf50852e327a81bf08b1abb3a03d96ec4f11c9e43e43a4c9bfe82b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87cc06a3_07fc_407c_91bf_0e5bfa8d2bcd.slice/crio-3dc00532f78d02d4ee53b5fab266967b08754b4d9b71efa4bd32bcef72847ce5.scope\": RecentStats: unable to find data in memory cache]" Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.379716 5200 generic.go:358] "Generic (PLEG): container finished" podID="79ca5afe-27f4-40ac-abec-e35e45e94345" containerID="00269e0f8dcf50852e327a81bf08b1abb3a03d96ec4f11c9e43e43a4c9bfe82b" exitCode=0 Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.379835 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5nj7t" event={"ID":"79ca5afe-27f4-40ac-abec-e35e45e94345","Type":"ContainerDied","Data":"00269e0f8dcf50852e327a81bf08b1abb3a03d96ec4f11c9e43e43a4c9bfe82b"} Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.379901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5nj7t" event={"ID":"79ca5afe-27f4-40ac-abec-e35e45e94345","Type":"ContainerStarted","Data":"6d5d139ad39a2029c5041645401322ac30bc27533a3698fb861957c933c9793b"} Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.382907 5200 generic.go:358] "Generic (PLEG): container finished" podID="87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" containerID="3dc00532f78d02d4ee53b5fab266967b08754b4d9b71efa4bd32bcef72847ce5" exitCode=0 Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.383473 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-83fd-account-create-update-h5bvf" event={"ID":"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd","Type":"ContainerDied","Data":"3dc00532f78d02d4ee53b5fab266967b08754b4d9b71efa4bd32bcef72847ce5"} Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.383514 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-83fd-account-create-update-h5bvf" event={"ID":"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd","Type":"ContainerStarted","Data":"b6bfa008e6c3fe20e871776a77136a019c11e157f424968a9a50fc57a8c721dd"} Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.693831 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.693834 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.701907 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:15:37 crc kubenswrapper[5200]: I1126 15:15:37.701913 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:15:38 crc kubenswrapper[5200]: I1126 15:15:38.953303 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:38 crc kubenswrapper[5200]: I1126 15:15:38.962624 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.091361 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-operator-scripts\") pod \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.092001 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca5afe-27f4-40ac-abec-e35e45e94345-operator-scripts\") pod \"79ca5afe-27f4-40ac-abec-e35e45e94345\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.092055 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntmtg\" (UniqueName: \"kubernetes.io/projected/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-kube-api-access-ntmtg\") pod \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\" (UID: \"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd\") " Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.092080 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmxv5\" (UniqueName: \"kubernetes.io/projected/79ca5afe-27f4-40ac-abec-e35e45e94345-kube-api-access-nmxv5\") pod \"79ca5afe-27f4-40ac-abec-e35e45e94345\" (UID: \"79ca5afe-27f4-40ac-abec-e35e45e94345\") " Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.092068 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" (UID: "87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.092483 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ca5afe-27f4-40ac-abec-e35e45e94345-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79ca5afe-27f4-40ac-abec-e35e45e94345" (UID: "79ca5afe-27f4-40ac-abec-e35e45e94345"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.093744 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.093771 5200 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79ca5afe-27f4-40ac-abec-e35e45e94345-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.099657 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ca5afe-27f4-40ac-abec-e35e45e94345-kube-api-access-nmxv5" (OuterVolumeSpecName: "kube-api-access-nmxv5") pod "79ca5afe-27f4-40ac-abec-e35e45e94345" (UID: "79ca5afe-27f4-40ac-abec-e35e45e94345"). InnerVolumeSpecName "kube-api-access-nmxv5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.101325 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-kube-api-access-ntmtg" (OuterVolumeSpecName: "kube-api-access-ntmtg") pod "87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" (UID: "87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd"). InnerVolumeSpecName "kube-api-access-ntmtg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.196282 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntmtg\" (UniqueName: \"kubernetes.io/projected/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd-kube-api-access-ntmtg\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.196634 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmxv5\" (UniqueName: \"kubernetes.io/projected/79ca5afe-27f4-40ac-abec-e35e45e94345-kube-api-access-nmxv5\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.405482 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5nj7t" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.405499 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5nj7t" event={"ID":"79ca5afe-27f4-40ac-abec-e35e45e94345","Type":"ContainerDied","Data":"6d5d139ad39a2029c5041645401322ac30bc27533a3698fb861957c933c9793b"} Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.405555 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d5d139ad39a2029c5041645401322ac30bc27533a3698fb861957c933c9793b" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.407279 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-83fd-account-create-update-h5bvf" event={"ID":"87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd","Type":"ContainerDied","Data":"b6bfa008e6c3fe20e871776a77136a019c11e157f424968a9a50fc57a8c721dd"} Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.407324 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6bfa008e6c3fe20e871776a77136a019c11e157f424968a9a50fc57a8c721dd" Nov 26 15:15:39 crc kubenswrapper[5200]: I1126 15:15:39.407352 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-83fd-account-create-update-h5bvf" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.052500 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-k77xb"] Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.054912 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79ca5afe-27f4-40ac-abec-e35e45e94345" containerName="mariadb-database-create" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.055056 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ca5afe-27f4-40ac-abec-e35e45e94345" containerName="mariadb-database-create" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.055194 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" containerName="mariadb-account-create-update" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.055266 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" containerName="mariadb-account-create-update" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.056098 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="79ca5afe-27f4-40ac-abec-e35e45e94345" containerName="mariadb-database-create" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.056192 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" containerName="mariadb-account-create-update" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.146530 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-k77xb"] Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.146665 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.149822 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-config-data\"" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.150630 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-manila-dockercfg-drvzl\"" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.285565 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-job-config-data\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.285648 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqbx\" (UniqueName: \"kubernetes.io/projected/c278290f-ff8f-407d-bdd9-78cbb944ad57-kube-api-access-zdqbx\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.285781 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-config-data\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.285828 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-combined-ca-bundle\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.387666 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-job-config-data\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.387757 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqbx\" (UniqueName: \"kubernetes.io/projected/c278290f-ff8f-407d-bdd9-78cbb944ad57-kube-api-access-zdqbx\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.387859 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-config-data\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.387902 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-combined-ca-bundle\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: 
I1126 15:15:41.393856 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-job-config-data\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.394985 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-combined-ca-bundle\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.408463 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-config-data\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.409747 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqbx\" (UniqueName: \"kubernetes.io/projected/c278290f-ff8f-407d-bdd9-78cbb944ad57-kube-api-access-zdqbx\") pod \"manila-db-sync-k77xb\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:41 crc kubenswrapper[5200]: I1126 15:15:41.466394 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:42 crc kubenswrapper[5200]: I1126 15:15:42.517982 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-k77xb"] Nov 26 15:15:42 crc kubenswrapper[5200]: W1126 15:15:42.529324 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc278290f_ff8f_407d_bdd9_78cbb944ad57.slice/crio-fc9e3aa91d5bc592488db90d215ad2d50acda0975b9ad649263b5988a3b99817 WatchSource:0}: Error finding container fc9e3aa91d5bc592488db90d215ad2d50acda0975b9ad649263b5988a3b99817: Status 404 returned error can't find the container with id fc9e3aa91d5bc592488db90d215ad2d50acda0975b9ad649263b5988a3b99817 Nov 26 15:15:43 crc kubenswrapper[5200]: E1126 15:15:43.034794 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2329676 actualBytes=10240 Nov 26 15:15:43 crc kubenswrapper[5200]: I1126 15:15:43.449723 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k77xb" event={"ID":"c278290f-ff8f-407d-bdd9-78cbb944ad57","Type":"ContainerStarted","Data":"fc9e3aa91d5bc592488db90d215ad2d50acda0975b9ad649263b5988a3b99817"} Nov 26 15:15:48 crc kubenswrapper[5200]: I1126 15:15:48.509790 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k77xb" event={"ID":"c278290f-ff8f-407d-bdd9-78cbb944ad57","Type":"ContainerStarted","Data":"aa64e4429244f42b2661ba7239b3bce128afeec7c704aaab31660f819f2468e2"} Nov 26 15:15:48 crc kubenswrapper[5200]: I1126 15:15:48.529987 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-k77xb" podStartSLOduration=2.868566215 podStartE2EDuration="7.529964767s" podCreationTimestamp="2025-11-26 15:15:41 +0000 UTC" firstStartedPulling="2025-11-26 15:15:42.538200965 +0000 UTC m=+6311.217402783" lastFinishedPulling="2025-11-26 15:15:47.199599527 +0000 UTC 
m=+6315.878801335" observedRunningTime="2025-11-26 15:15:48.523097893 +0000 UTC m=+6317.202299731" watchObservedRunningTime="2025-11-26 15:15:48.529964767 +0000 UTC m=+6317.209166585" Nov 26 15:15:50 crc kubenswrapper[5200]: I1126 15:15:50.532183 5200 generic.go:358] "Generic (PLEG): container finished" podID="c278290f-ff8f-407d-bdd9-78cbb944ad57" containerID="aa64e4429244f42b2661ba7239b3bce128afeec7c704aaab31660f819f2468e2" exitCode=0 Nov 26 15:15:50 crc kubenswrapper[5200]: I1126 15:15:50.532271 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k77xb" event={"ID":"c278290f-ff8f-407d-bdd9-78cbb944ad57","Type":"ContainerDied","Data":"aa64e4429244f42b2661ba7239b3bce128afeec7c704aaab31660f819f2468e2"} Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.191916 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.339629 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-combined-ca-bundle\") pod \"c278290f-ff8f-407d-bdd9-78cbb944ad57\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.340044 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-config-data\") pod \"c278290f-ff8f-407d-bdd9-78cbb944ad57\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.340155 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqbx\" (UniqueName: \"kubernetes.io/projected/c278290f-ff8f-407d-bdd9-78cbb944ad57-kube-api-access-zdqbx\") pod \"c278290f-ff8f-407d-bdd9-78cbb944ad57\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.340186 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-job-config-data\") pod \"c278290f-ff8f-407d-bdd9-78cbb944ad57\" (UID: \"c278290f-ff8f-407d-bdd9-78cbb944ad57\") " Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.346078 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "c278290f-ff8f-407d-bdd9-78cbb944ad57" (UID: "c278290f-ff8f-407d-bdd9-78cbb944ad57"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.347118 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c278290f-ff8f-407d-bdd9-78cbb944ad57-kube-api-access-zdqbx" (OuterVolumeSpecName: "kube-api-access-zdqbx") pod "c278290f-ff8f-407d-bdd9-78cbb944ad57" (UID: "c278290f-ff8f-407d-bdd9-78cbb944ad57"). InnerVolumeSpecName "kube-api-access-zdqbx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.349747 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-config-data" (OuterVolumeSpecName: "config-data") pod "c278290f-ff8f-407d-bdd9-78cbb944ad57" (UID: "c278290f-ff8f-407d-bdd9-78cbb944ad57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.374487 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c278290f-ff8f-407d-bdd9-78cbb944ad57" (UID: "c278290f-ff8f-407d-bdd9-78cbb944ad57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.442463 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdqbx\" (UniqueName: \"kubernetes.io/projected/c278290f-ff8f-407d-bdd9-78cbb944ad57-kube-api-access-zdqbx\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.442498 5200 reconciler_common.go:299] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.442506 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.442515 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c278290f-ff8f-407d-bdd9-78cbb944ad57-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.554359 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-k77xb" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.554608 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k77xb" event={"ID":"c278290f-ff8f-407d-bdd9-78cbb944ad57","Type":"ContainerDied","Data":"fc9e3aa91d5bc592488db90d215ad2d50acda0975b9ad649263b5988a3b99817"} Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.554773 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9e3aa91d5bc592488db90d215ad2d50acda0975b9ad649263b5988a3b99817" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.853773 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.855530 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c278290f-ff8f-407d-bdd9-78cbb944ad57" containerName="manila-db-sync" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.855558 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c278290f-ff8f-407d-bdd9-78cbb944ad57" containerName="manila-db-sync" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.855892 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c278290f-ff8f-407d-bdd9-78cbb944ad57" containerName="manila-db-sync" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.863664 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.864204 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.874601 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-scheduler-config-data\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.875026 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-manila-dockercfg-drvzl\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.875420 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.876075 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-scripts\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.877439 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-config-data\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.877842 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-share-share1-config-data\"" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.890348 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.914234 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.953301 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c810050e-86da-4bea-8606-81b4a0c7d1a4-ceph\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.953723 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9762bc2-03ed-4011-b44e-817143b25ca0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.953772 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-scripts\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954032 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-config-data\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954076 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-config-data\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954113 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954159 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c810050e-86da-4bea-8606-81b4a0c7d1a4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954181 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954297 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-scripts\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954373 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954421 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c810050e-86da-4bea-8606-81b4a0c7d1a4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954449 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gttxt\" (UniqueName: \"kubernetes.io/projected/c810050e-86da-4bea-8606-81b4a0c7d1a4-kube-api-access-gttxt\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954524 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmjj\" (UniqueName: \"kubernetes.io/projected/b9762bc2-03ed-4011-b44e-817143b25ca0-kube-api-access-8hmjj\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:52 crc kubenswrapper[5200]: I1126 15:15:52.954570 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.052837 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-678f5cc89-q5hvk"] Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058404 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-scripts\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058579 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-config-data\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058611 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-config-data\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058633 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058675 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c810050e-86da-4bea-8606-81b4a0c7d1a4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058706 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058755 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-scripts\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058808 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058837 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c810050e-86da-4bea-8606-81b4a0c7d1a4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058859 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gttxt\" (UniqueName: \"kubernetes.io/projected/c810050e-86da-4bea-8606-81b4a0c7d1a4-kube-api-access-gttxt\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058901 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hmjj\" (UniqueName: \"kubernetes.io/projected/b9762bc2-03ed-4011-b44e-817143b25ca0-kube-api-access-8hmjj\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " 
pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058930 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.058979 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c810050e-86da-4bea-8606-81b4a0c7d1a4-ceph\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.059024 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9762bc2-03ed-4011-b44e-817143b25ca0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.059112 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9762bc2-03ed-4011-b44e-817143b25ca0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.070021 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-scripts\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.071291 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c810050e-86da-4bea-8606-81b4a0c7d1a4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.073291 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c810050e-86da-4bea-8606-81b4a0c7d1a4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.086672 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-678f5cc89-q5hvk"] Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.086872 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.102451 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.103093 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hmjj\" (UniqueName: \"kubernetes.io/projected/b9762bc2-03ed-4011-b44e-817143b25ca0-kube-api-access-8hmjj\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.103325 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-scripts\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.103322 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.110029 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-config-data\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.112206 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.118437 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gttxt\" (UniqueName: \"kubernetes.io/projected/c810050e-86da-4bea-8606-81b4a0c7d1a4-kube-api-access-gttxt\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.120223 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c810050e-86da-4bea-8606-81b4a0c7d1a4-config-data\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.131384 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c810050e-86da-4bea-8606-81b4a0c7d1a4-ceph\") pod \"manila-share-share1-0\" (UID: \"c810050e-86da-4bea-8606-81b4a0c7d1a4\") " pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.136884 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b9762bc2-03ed-4011-b44e-817143b25ca0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b9762bc2-03ed-4011-b44e-817143b25ca0\") " pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.165453 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86v8\" (UniqueName: \"kubernetes.io/projected/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-kube-api-access-t86v8\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.165580 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-nb\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.165605 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-dns-svc\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.165676 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-sb\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.166142 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-config\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.220706 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.221417 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.238832 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.241436 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"manila-api-config-data\"" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.241887 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.242807 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.270873 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-config\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.271011 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t86v8\" (UniqueName: \"kubernetes.io/projected/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-kube-api-access-t86v8\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.271064 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-nb\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.271083 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-dns-svc\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.271126 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-sb\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.272403 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-config\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.272891 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-dns-svc\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.273513 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-sb\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.273648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-nb\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: 
\"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.296614 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86v8\" (UniqueName: \"kubernetes.io/projected/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-kube-api-access-t86v8\") pod \"dnsmasq-dns-678f5cc89-q5hvk\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.373966 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50646301-6fce-4190-9a6b-0978c91d58ac-logs\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.374499 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-config-data-custom\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.374889 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-scripts\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.374995 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50646301-6fce-4190-9a6b-0978c91d58ac-etc-machine-id\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.375085 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.375139 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2jx\" (UniqueName: \"kubernetes.io/projected/50646301-6fce-4190-9a6b-0978c91d58ac-kube-api-access-rp2jx\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.376191 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-config-data\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479420 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-config-data-custom\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479495 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-scripts\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479570 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50646301-6fce-4190-9a6b-0978c91d58ac-etc-machine-id\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479611 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479630 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2jx\" (UniqueName: \"kubernetes.io/projected/50646301-6fce-4190-9a6b-0978c91d58ac-kube-api-access-rp2jx\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479848 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-config-data\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.479913 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50646301-6fce-4190-9a6b-0978c91d58ac-logs\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.487287 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50646301-6fce-4190-9a6b-0978c91d58ac-etc-machine-id\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.487750 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50646301-6fce-4190-9a6b-0978c91d58ac-logs\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.488312 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-config-data-custom\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.488313 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.488616 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-scripts\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.505560 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50646301-6fce-4190-9a6b-0978c91d58ac-config-data\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.507905 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2jx\" (UniqueName: \"kubernetes.io/projected/50646301-6fce-4190-9a6b-0978c91d58ac-kube-api-access-rp2jx\") pod \"manila-api-0\" (UID: \"50646301-6fce-4190-9a6b-0978c91d58ac\") " pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.579084 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.769202 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.945838 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 26 15:15:53 crc kubenswrapper[5200]: I1126 15:15:53.966047 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.153755 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.267837 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-678f5cc89-q5hvk"] Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.516238 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 26 15:15:54 crc kubenswrapper[5200]: W1126 15:15:54.530668 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50646301_6fce_4190_9a6b_0978c91d58ac.slice/crio-e42904373240f7d59ed9c52ca3bce507d52b3af53e9a4af84125253802d85f35 WatchSource:0}: Error finding container e42904373240f7d59ed9c52ca3bce507d52b3af53e9a4af84125253802d85f35: Status 404 returned error can't find the container with id e42904373240f7d59ed9c52ca3bce507d52b3af53e9a4af84125253802d85f35 Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.601987 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b9762bc2-03ed-4011-b44e-817143b25ca0","Type":"ContainerStarted","Data":"437645f39733ca91c97d36809fec9af872c848051f87b1cba60f4e367797f4c2"} Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.605614 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" event={"ID":"8acc51b7-8a56-4e06-a7e4-df94f2d2a551","Type":"ContainerStarted","Data":"03a162a052e8450f373ff7f419ee81d7eb70c33bc69dc4d82c9355922252a873"} Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.607603 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c810050e-86da-4bea-8606-81b4a0c7d1a4","Type":"ContainerStarted","Data":"bb6b9a7f8ca44dbffa2a24f53fdc44bd1ea7c8e4f25c7208394c5dd85b937373"} Nov 26 15:15:54 crc kubenswrapper[5200]: I1126 15:15:54.609486 
5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"50646301-6fce-4190-9a6b-0978c91d58ac","Type":"ContainerStarted","Data":"e42904373240f7d59ed9c52ca3bce507d52b3af53e9a4af84125253802d85f35"} Nov 26 15:15:55 crc kubenswrapper[5200]: I1126 15:15:55.622582 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" event={"ID":"8acc51b7-8a56-4e06-a7e4-df94f2d2a551","Type":"ContainerStarted","Data":"b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9"} Nov 26 15:15:55 crc kubenswrapper[5200]: I1126 15:15:55.627517 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"50646301-6fce-4190-9a6b-0978c91d58ac","Type":"ContainerStarted","Data":"07222a133679e9aa045df9467aaaa17d34a4ba10914384411157e5f361eebc46"} Nov 26 15:15:56 crc kubenswrapper[5200]: I1126 15:15:56.647427 5200 generic.go:358] "Generic (PLEG): container finished" podID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerID="b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9" exitCode=0 Nov 26 15:15:56 crc kubenswrapper[5200]: I1126 15:15:56.647962 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" event={"ID":"8acc51b7-8a56-4e06-a7e4-df94f2d2a551","Type":"ContainerDied","Data":"b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9"} Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.659466 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b9762bc2-03ed-4011-b44e-817143b25ca0","Type":"ContainerStarted","Data":"452fee3a15f7e7a6e7275b5f1bfbb3bdd70728eaf3b2602556f5ee5f8b0ab30e"} Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.660069 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b9762bc2-03ed-4011-b44e-817143b25ca0","Type":"ContainerStarted","Data":"1abea197613edb8b2a68a8f50eb90c85fe76c692bb6e8c18a2af2ce880b0cead"} Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.661859 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" event={"ID":"8acc51b7-8a56-4e06-a7e4-df94f2d2a551","Type":"ContainerStarted","Data":"96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301"} Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.661904 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.666320 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"50646301-6fce-4190-9a6b-0978c91d58ac","Type":"ContainerStarted","Data":"912ad6627dc856c6a09668ae781cc71ff571fa281920d37329328fc64205652a"} Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.666390 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/manila-api-0" Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.684243 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.567260093 podStartE2EDuration="5.684224534s" podCreationTimestamp="2025-11-26 15:15:52 +0000 UTC" firstStartedPulling="2025-11-26 15:15:53.966501651 +0000 UTC m=+6322.645703469" lastFinishedPulling="2025-11-26 15:15:56.083466092 +0000 UTC m=+6324.762667910" observedRunningTime="2025-11-26 15:15:57.679933539 +0000 UTC m=+6326.359135377" 
watchObservedRunningTime="2025-11-26 15:15:57.684224534 +0000 UTC m=+6326.363426352" Nov 26 15:15:57 crc kubenswrapper[5200]: I1126 15:15:57.714676 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.714644571 podStartE2EDuration="4.714644571s" podCreationTimestamp="2025-11-26 15:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:15:57.702002341 +0000 UTC m=+6326.381204169" watchObservedRunningTime="2025-11-26 15:15:57.714644571 +0000 UTC m=+6326.393846389" Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.338229 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" podStartSLOduration=5.338210524 podStartE2EDuration="5.338210524s" podCreationTimestamp="2025-11-26 15:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:15:57.721348971 +0000 UTC m=+6326.400550789" watchObservedRunningTime="2025-11-26 15:15:58.338210524 +0000 UTC m=+6327.017412332" Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.342889 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.343343 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-central-agent" containerID="cri-o://b641993913e79c393cfb1d33499961aa10388c951b5013382479dd8008a87f72" gracePeriod=30 Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.344268 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="proxy-httpd" containerID="cri-o://5d171398d3a557f4a4431ba2ad40fcb8024e812b94e45adc4e81ab8ddb1445c0" gracePeriod=30 Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.344312 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-notification-agent" containerID="cri-o://e3eed3351dd891170c5166e0d17162938684d0206cd43423dbfee3378d59cf9b" gracePeriod=30 Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.344369 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="sg-core" containerID="cri-o://cd79668a5757e2d24c79c3140cdab3b54b4e81d6aebdf618d24db3f6d30cf8ba" gracePeriod=30 Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.358407 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.690831 5200 generic.go:358] "Generic (PLEG): container finished" podID="889ff754-6fa4-4704-873f-f5606e3d8457" containerID="5d171398d3a557f4a4431ba2ad40fcb8024e812b94e45adc4e81ab8ddb1445c0" exitCode=0 Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.690886 5200 generic.go:358] "Generic (PLEG): container finished" podID="889ff754-6fa4-4704-873f-f5606e3d8457" containerID="cd79668a5757e2d24c79c3140cdab3b54b4e81d6aebdf618d24db3f6d30cf8ba" 
exitCode=2 Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.691470 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerDied","Data":"5d171398d3a557f4a4431ba2ad40fcb8024e812b94e45adc4e81ab8ddb1445c0"} Nov 26 15:15:58 crc kubenswrapper[5200]: I1126 15:15:58.691516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerDied","Data":"cd79668a5757e2d24c79c3140cdab3b54b4e81d6aebdf618d24db3f6d30cf8ba"} Nov 26 15:15:59 crc kubenswrapper[5200]: I1126 15:15:59.706933 5200 generic.go:358] "Generic (PLEG): container finished" podID="889ff754-6fa4-4704-873f-f5606e3d8457" containerID="b641993913e79c393cfb1d33499961aa10388c951b5013382479dd8008a87f72" exitCode=0 Nov 26 15:15:59 crc kubenswrapper[5200]: I1126 15:15:59.706997 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerDied","Data":"b641993913e79c393cfb1d33499961aa10388c951b5013382479dd8008a87f72"} Nov 26 15:16:01 crc kubenswrapper[5200]: I1126 15:16:01.729530 5200 generic.go:358] "Generic (PLEG): container finished" podID="889ff754-6fa4-4704-873f-f5606e3d8457" containerID="e3eed3351dd891170c5166e0d17162938684d0206cd43423dbfee3378d59cf9b" exitCode=0 Nov 26 15:16:01 crc kubenswrapper[5200]: I1126 15:16:01.729660 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerDied","Data":"e3eed3351dd891170c5166e0d17162938684d0206cd43423dbfee3378d59cf9b"} Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.222030 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.368760 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.477189 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-scripts\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.477771 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-log-httpd\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.477840 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-sg-core-conf-yaml\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.477883 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-config-data\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.477952 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8xhc\" (UniqueName: \"kubernetes.io/projected/889ff754-6fa4-4704-873f-f5606e3d8457-kube-api-access-f8xhc\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.478003 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-run-httpd\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.478088 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-combined-ca-bundle\") pod \"889ff754-6fa4-4704-873f-f5606e3d8457\" (UID: \"889ff754-6fa4-4704-873f-f5606e3d8457\") " Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.478627 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.478844 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.479271 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.479314 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/889ff754-6fa4-4704-873f-f5606e3d8457-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.483496 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889ff754-6fa4-4704-873f-f5606e3d8457-kube-api-access-f8xhc" (OuterVolumeSpecName: "kube-api-access-f8xhc") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "kube-api-access-f8xhc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.487868 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-scripts" (OuterVolumeSpecName: "scripts") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.513935 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.568355 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.581596 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.581626 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.581635 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.581644 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8xhc\" (UniqueName: \"kubernetes.io/projected/889ff754-6fa4-4704-873f-f5606e3d8457-kube-api-access-f8xhc\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.601138 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-config-data" (OuterVolumeSpecName: "config-data") pod "889ff754-6fa4-4704-873f-f5606e3d8457" (UID: "889ff754-6fa4-4704-873f-f5606e3d8457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.683408 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889ff754-6fa4-4704-873f-f5606e3d8457-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.697843 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.762260 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd68489b9-gnwvm"] Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.762742 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerName="dnsmasq-dns" containerID="cri-o://3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2" gracePeriod=10 Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.798808 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"889ff754-6fa4-4704-873f-f5606e3d8457","Type":"ContainerDied","Data":"77a0b0a3db804e1d45ec385430a59e4cef2e16fcd8f514d872e257296be5c55e"} Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.798858 5200 scope.go:117] "RemoveContainer" containerID="5d171398d3a557f4a4431ba2ad40fcb8024e812b94e45adc4e81ab8ddb1445c0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.799028 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.862012 5200 scope.go:117] "RemoveContainer" containerID="cd79668a5757e2d24c79c3140cdab3b54b4e81d6aebdf618d24db3f6d30cf8ba" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.881610 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.903615 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.918263 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929387 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-notification-agent" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929420 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-notification-agent" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929464 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="proxy-httpd" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929471 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="proxy-httpd" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929493 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="sg-core" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929499 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="sg-core" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929506 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-central-agent" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929512 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-central-agent" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929733 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-notification-agent" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929750 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="sg-core" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929763 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="ceilometer-central-agent" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.929780 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" containerName="proxy-httpd" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.948179 5200 scope.go:117] "RemoveContainer" containerID="e3eed3351dd891170c5166e0d17162938684d0206cd43423dbfee3378d59cf9b" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.959750 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.959930 
5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.962447 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.963407 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.990411 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.990898 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-run-httpd\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.990930 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-config-data\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.991032 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-scripts\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.991092 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctcmz\" (UniqueName: \"kubernetes.io/projected/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-kube-api-access-ctcmz\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.991197 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-log-httpd\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:03 crc kubenswrapper[5200]: I1126 15:16:03.991239 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.026330 5200 scope.go:117] "RemoveContainer" containerID="b641993913e79c393cfb1d33499961aa10388c951b5013382479dd8008a87f72" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.093722 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.093760 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-run-httpd\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.093784 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-config-data\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.094137 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-scripts\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.094299 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctcmz\" (UniqueName: \"kubernetes.io/projected/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-kube-api-access-ctcmz\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.094713 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-run-httpd\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.095026 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-log-httpd\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.095072 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.095893 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-log-httpd\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.112622 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-scripts\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.112780 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 
15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.113128 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.115338 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-config-data\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.121239 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctcmz\" (UniqueName: \"kubernetes.io/projected/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-kube-api-access-ctcmz\") pod \"ceilometer-0\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.293529 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.540884 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.610940 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-dns-svc\") pod \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.611520 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-config\") pod \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.611916 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-sb\") pod \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.611958 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-nb\") pod \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.612147 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfq7w\" (UniqueName: \"kubernetes.io/projected/5813b9c1-50b0-4a0c-90c3-fa2de1398326-kube-api-access-wfq7w\") pod \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\" (UID: \"5813b9c1-50b0-4a0c-90c3-fa2de1398326\") " Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.622105 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5813b9c1-50b0-4a0c-90c3-fa2de1398326-kube-api-access-wfq7w" (OuterVolumeSpecName: "kube-api-access-wfq7w") pod "5813b9c1-50b0-4a0c-90c3-fa2de1398326" (UID: "5813b9c1-50b0-4a0c-90c3-fa2de1398326"). 
InnerVolumeSpecName "kube-api-access-wfq7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.720015 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfq7w\" (UniqueName: \"kubernetes.io/projected/5813b9c1-50b0-4a0c-90c3-fa2de1398326-kube-api-access-wfq7w\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.778828 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5813b9c1-50b0-4a0c-90c3-fa2de1398326" (UID: "5813b9c1-50b0-4a0c-90c3-fa2de1398326"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.824186 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.937075 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5813b9c1-50b0-4a0c-90c3-fa2de1398326" (UID: "5813b9c1-50b0-4a0c-90c3-fa2de1398326"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.937541 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.945977 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c810050e-86da-4bea-8606-81b4a0c7d1a4","Type":"ContainerStarted","Data":"80b49409fc1b6851bf3f4bf26af04891c6e3ccd08f43c44ce787fa1da31f5d44"} Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.949421 5200 generic.go:358] "Generic (PLEG): container finished" podID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerID="3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2" exitCode=0 Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.949593 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" event={"ID":"5813b9c1-50b0-4a0c-90c3-fa2de1398326","Type":"ContainerDied","Data":"3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2"} Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.952900 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" event={"ID":"5813b9c1-50b0-4a0c-90c3-fa2de1398326","Type":"ContainerDied","Data":"9a60f84e677758d575899b934c20bb95d8cf48e1fd945668f2cdbd7c8886ba10"} Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.952956 5200 scope.go:117] "RemoveContainer" containerID="3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.949643 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd68489b9-gnwvm" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.959350 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-config" (OuterVolumeSpecName: "config") pod "5813b9c1-50b0-4a0c-90c3-fa2de1398326" (UID: "5813b9c1-50b0-4a0c-90c3-fa2de1398326"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.962172 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.969183 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5813b9c1-50b0-4a0c-90c3-fa2de1398326" (UID: "5813b9c1-50b0-4a0c-90c3-fa2de1398326"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:16:04 crc kubenswrapper[5200]: I1126 15:16:04.999919 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889ff754-6fa4-4704-873f-f5606e3d8457" path="/var/lib/kubelet/pods/889ff754-6fa4-4704-873f-f5606e3d8457/volumes" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.029901 5200 scope.go:117] "RemoveContainer" containerID="8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.039804 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.039833 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5813b9c1-50b0-4a0c-90c3-fa2de1398326-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.060725 5200 scope.go:117] "RemoveContainer" containerID="3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2" Nov 26 15:16:05 crc kubenswrapper[5200]: E1126 15:16:05.062455 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2\": container with ID starting with 3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2 not found: ID does not exist" containerID="3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.062506 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2"} err="failed to get container status \"3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2\": rpc error: code = NotFound desc = could not find container \"3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2\": container with ID starting with 3c1fb68ed9db730451b7f15cd6cadfac34acc44d11f8bf0f71d6e985922e0aa2 not found: ID does not exist" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.062536 5200 scope.go:117] "RemoveContainer" containerID="8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9" Nov 26 15:16:05 crc kubenswrapper[5200]: E1126 15:16:05.062826 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9\": container with ID starting with 8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9 not found: ID does not exist" containerID="8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.062856 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9"} err="failed to get container status \"8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9\": rpc error: code = NotFound desc = could not find container \"8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9\": container with ID starting with 8943f59cb2e62490135e0b62952b3a8f197a25a2347a3aeb6af333afbe6338a9 not found: ID does not exist" Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.281452 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd68489b9-gnwvm"] Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.290931 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd68489b9-gnwvm"] Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.969847 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c810050e-86da-4bea-8606-81b4a0c7d1a4","Type":"ContainerStarted","Data":"a3f226c9ae5b1913bb840366c78a400f106debe826bce3630d1e898718fac39f"} Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.974996 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerStarted","Data":"10a7b44d310c2b7a32b66bcd4169adc0e1eb06fa1ea44d3f20285c0aed83b050"} Nov 26 15:16:05 crc kubenswrapper[5200]: I1126 15:16:05.995003 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.938301656 podStartE2EDuration="13.994979112s" podCreationTimestamp="2025-11-26 15:15:52 +0000 UTC" firstStartedPulling="2025-11-26 15:15:54.160751737 +0000 UTC m=+6322.839953555" lastFinishedPulling="2025-11-26 15:16:03.217429183 +0000 UTC m=+6331.896631011" observedRunningTime="2025-11-26 15:16:05.988441066 +0000 UTC m=+6334.667642894" watchObservedRunningTime="2025-11-26 15:16:05.994979112 +0000 UTC m=+6334.674180940" Nov 26 15:16:06 crc kubenswrapper[5200]: I1126 15:16:06.618916 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:06 crc kubenswrapper[5200]: I1126 15:16:06.982793 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" path="/var/lib/kubelet/pods/5813b9c1-50b0-4a0c-90c3-fa2de1398326/volumes" Nov 26 15:16:06 crc kubenswrapper[5200]: I1126 15:16:06.992152 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerStarted","Data":"bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b"} Nov 26 15:16:08 crc kubenswrapper[5200]: I1126 15:16:08.003151 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerStarted","Data":"3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c"} Nov 26 15:16:09 crc kubenswrapper[5200]: I1126 15:16:09.030218 5200 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerStarted","Data":"efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5"} Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.043731 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerStarted","Data":"6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138"} Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.044065 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.043866 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-central-agent" containerID="cri-o://bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b" gracePeriod=30 Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.044115 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="proxy-httpd" containerID="cri-o://6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138" gracePeriod=30 Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.044204 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-notification-agent" containerID="cri-o://3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c" gracePeriod=30 Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.044239 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="sg-core" containerID="cri-o://efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5" gracePeriod=30 Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.080906 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.610115268 podStartE2EDuration="7.080880601s" podCreationTimestamp="2025-11-26 15:16:03 +0000 UTC" firstStartedPulling="2025-11-26 15:16:04.965283394 +0000 UTC m=+6333.644485212" lastFinishedPulling="2025-11-26 15:16:09.436048727 +0000 UTC m=+6338.115250545" observedRunningTime="2025-11-26 15:16:10.069905556 +0000 UTC m=+6338.749107384" watchObservedRunningTime="2025-11-26 15:16:10.080880601 +0000 UTC m=+6338.760082439" Nov 26 15:16:10 crc kubenswrapper[5200]: I1126 15:16:10.332259 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.057933 5200 generic.go:358] "Generic (PLEG): container finished" podID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerID="6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138" exitCode=0 Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.058323 5200 generic.go:358] "Generic (PLEG): container finished" podID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerID="efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5" exitCode=2 Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.058337 5200 generic.go:358] "Generic (PLEG): container finished" podID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" 
containerID="3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c" exitCode=0 Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.058144 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerDied","Data":"6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138"} Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.058482 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerDied","Data":"efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5"} Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.058527 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerDied","Data":"3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c"} Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.690808 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711282 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-sg-core-conf-yaml\") pod \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711338 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-config-data\") pod \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711488 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctcmz\" (UniqueName: \"kubernetes.io/projected/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-kube-api-access-ctcmz\") pod \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711549 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-scripts\") pod \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711752 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-log-httpd\") pod \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711875 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-combined-ca-bundle\") pod \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.711983 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-run-httpd\") pod 
\"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\" (UID: \"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b\") " Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.712858 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.713814 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.719408 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-scripts" (OuterVolumeSpecName: "scripts") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.720131 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-kube-api-access-ctcmz" (OuterVolumeSpecName: "kube-api-access-ctcmz") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "kube-api-access-ctcmz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.761739 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.797063 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.814997 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.815026 5200 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.815035 5200 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.815043 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctcmz\" (UniqueName: \"kubernetes.io/projected/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-kube-api-access-ctcmz\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.815054 5200 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.815062 5200 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.849936 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-config-data" (OuterVolumeSpecName: "config-data") pod "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" (UID: "cdbdbb46-6357-45f5-89b7-fc0f78a8be5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:16:11 crc kubenswrapper[5200]: I1126 15:16:11.917488 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.072259 5200 generic.go:358] "Generic (PLEG): container finished" podID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerID="bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b" exitCode=0 Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.072314 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerDied","Data":"bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b"} Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.072358 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.072386 5200 scope.go:117] "RemoveContainer" containerID="6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.072369 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdbdbb46-6357-45f5-89b7-fc0f78a8be5b","Type":"ContainerDied","Data":"10a7b44d310c2b7a32b66bcd4169adc0e1eb06fa1ea44d3f20285c0aed83b050"} Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.094176 5200 scope.go:117] "RemoveContainer" containerID="efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.114415 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.123871 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.137919 5200 scope.go:117] "RemoveContainer" containerID="3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.142584 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143896 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="proxy-httpd" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143918 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="proxy-httpd" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143934 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="sg-core" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143939 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="sg-core" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143951 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-notification-agent" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143959 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-notification-agent" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143982 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerName="init" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.143987 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerName="init" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144014 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-central-agent" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144020 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-central-agent" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144030 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerName="dnsmasq-dns" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144036 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerName="dnsmasq-dns" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144224 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5813b9c1-50b0-4a0c-90c3-fa2de1398326" containerName="dnsmasq-dns" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144235 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-notification-agent" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144249 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="ceilometer-central-agent" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144261 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="proxy-httpd" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.144271 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" containerName="sg-core" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.153925 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.156623 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.156948 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.163826 5200 scope.go:117] "RemoveContainer" containerID="bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.177727 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.203535 5200 scope.go:117] "RemoveContainer" containerID="6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138" Nov 26 15:16:12 crc kubenswrapper[5200]: E1126 15:16:12.203926 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138\": container with ID starting with 6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138 not found: ID does not exist" containerID="6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.203977 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138"} err="failed to get container status \"6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138\": rpc error: code = NotFound desc = could not find container \"6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138\": container with ID starting with 6308eef47c630b1671b04fcffcd848ff6eec05d28ed49f2745c415a236934138 not found: ID does not exist" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.204011 5200 scope.go:117] "RemoveContainer" 
containerID="efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5" Nov 26 15:16:12 crc kubenswrapper[5200]: E1126 15:16:12.204380 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5\": container with ID starting with efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5 not found: ID does not exist" containerID="efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.204424 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5"} err="failed to get container status \"efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5\": rpc error: code = NotFound desc = could not find container \"efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5\": container with ID starting with efa5eb9d79583bd2c723137d1e5ce1d45eb12be550da3971f451c52966e52db5 not found: ID does not exist" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.204449 5200 scope.go:117] "RemoveContainer" containerID="3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c" Nov 26 15:16:12 crc kubenswrapper[5200]: E1126 15:16:12.204837 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c\": container with ID starting with 3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c not found: ID does not exist" containerID="3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.204869 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c"} err="failed to get container status \"3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c\": rpc error: code = NotFound desc = could not find container \"3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c\": container with ID starting with 3043d6ee6d484ac93a6269e43978c1c4dd5aa78acb62da9dffc3f98d46cf8d2c not found: ID does not exist" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.204890 5200 scope.go:117] "RemoveContainer" containerID="bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b" Nov 26 15:16:12 crc kubenswrapper[5200]: E1126 15:16:12.205170 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b\": container with ID starting with bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b not found: ID does not exist" containerID="bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.205192 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b"} err="failed to get container status \"bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b\": rpc error: code = NotFound desc = could not find container \"bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b\": container with ID starting with 
bb9fee68c577db21325f394cb8c77d4d9b5c4c6f668535e8e12cf58893c1268b not found: ID does not exist" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.225315 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f4347d-c34a-497d-a741-62d81b71a86d-log-httpd\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.225371 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f4347d-c34a-497d-a741-62d81b71a86d-run-httpd\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.225516 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-config-data\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.225610 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.225864 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-scripts\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.226015 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.226157 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z4pw\" (UniqueName: \"kubernetes.io/projected/b7f4347d-c34a-497d-a741-62d81b71a86d-kube-api-access-7z4pw\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.330676 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f4347d-c34a-497d-a741-62d81b71a86d-log-httpd\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.331516 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f4347d-c34a-497d-a741-62d81b71a86d-run-httpd\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.331534 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/b7f4347d-c34a-497d-a741-62d81b71a86d-log-httpd\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.332080 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-config-data\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.332313 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.332506 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-scripts\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.332613 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.332778 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z4pw\" (UniqueName: \"kubernetes.io/projected/b7f4347d-c34a-497d-a741-62d81b71a86d-kube-api-access-7z4pw\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.332145 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f4347d-c34a-497d-a741-62d81b71a86d-run-httpd\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.337603 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-scripts\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.338259 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.338630 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-config-data\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.355923 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7f4347d-c34a-497d-a741-62d81b71a86d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:12 crc kubenswrapper[5200]: I1126 15:16:12.358130 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z4pw\" (UniqueName: \"kubernetes.io/projected/b7f4347d-c34a-497d-a741-62d81b71a86d-kube-api-access-7z4pw\") pod \"ceilometer-0\" (UID: \"b7f4347d-c34a-497d-a741-62d81b71a86d\") " pod="openstack/ceilometer-0" Nov 26 15:16:13 crc kubenswrapper[5200]: I1126 15:16:13.073214 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:16:13 crc kubenswrapper[5200]: I1126 15:16:13.111591 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdbdbb46-6357-45f5-89b7-fc0f78a8be5b" path="/var/lib/kubelet/pods/cdbdbb46-6357-45f5-89b7-fc0f78a8be5b/volumes" Nov 26 15:16:13 crc kubenswrapper[5200]: I1126 15:16:13.243597 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 26 15:16:13 crc kubenswrapper[5200]: I1126 15:16:13.573636 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:16:13 crc kubenswrapper[5200]: W1126 15:16:13.579005 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7f4347d_c34a_497d_a741_62d81b71a86d.slice/crio-e97bf66e7eabd156024aaac37e41ebd85bd374c721ec76ff404718b20891c57a WatchSource:0}: Error finding container e97bf66e7eabd156024aaac37e41ebd85bd374c721ec76ff404718b20891c57a: Status 404 returned error can't find the container with id e97bf66e7eabd156024aaac37e41ebd85bd374c721ec76ff404718b20891c57a Nov 26 15:16:14 crc kubenswrapper[5200]: I1126 15:16:14.124495 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f4347d-c34a-497d-a741-62d81b71a86d","Type":"ContainerStarted","Data":"e97bf66e7eabd156024aaac37e41ebd85bd374c721ec76ff404718b20891c57a"} Nov 26 15:16:14 crc kubenswrapper[5200]: I1126 15:16:14.680708 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 26 15:16:14 crc kubenswrapper[5200]: I1126 15:16:14.896388 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 26 15:16:15 crc kubenswrapper[5200]: I1126 15:16:15.135532 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f4347d-c34a-497d-a741-62d81b71a86d","Type":"ContainerStarted","Data":"55751e4f5b8e22d36e69a49164c5ecb012c6dad791103412d8b03155cfdcd65e"} Nov 26 15:16:17 crc kubenswrapper[5200]: I1126 15:16:17.156709 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f4347d-c34a-497d-a741-62d81b71a86d","Type":"ContainerStarted","Data":"a3c8120382cbc59d5ca1ab88aa40fd663c6db3248b471b7eab05d2e74d3fcc00"} Nov 26 15:16:17 crc kubenswrapper[5200]: I1126 15:16:17.157290 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f4347d-c34a-497d-a741-62d81b71a86d","Type":"ContainerStarted","Data":"59e26b34741f186408f3f82d85fec49794d3022c35b3156feae51aac3c067f4a"} Nov 26 15:16:19 crc kubenswrapper[5200]: I1126 15:16:19.181607 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b7f4347d-c34a-497d-a741-62d81b71a86d","Type":"ContainerStarted","Data":"b86ad40ac38462f7eabe69e75821886dc52b537bdc645c428b7c718ed342237f"} Nov 26 15:16:19 crc kubenswrapper[5200]: I1126 15:16:19.182206 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Nov 26 15:16:19 crc kubenswrapper[5200]: I1126 15:16:19.207158 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.406038734 podStartE2EDuration="7.207139566s" podCreationTimestamp="2025-11-26 15:16:12 +0000 UTC" firstStartedPulling="2025-11-26 15:16:13.581051892 +0000 UTC m=+6342.260253710" lastFinishedPulling="2025-11-26 15:16:18.382152724 +0000 UTC m=+6347.061354542" observedRunningTime="2025-11-26 15:16:19.197386704 +0000 UTC m=+6347.876588532" watchObservedRunningTime="2025-11-26 15:16:19.207139566 +0000 UTC m=+6347.886341394" Nov 26 15:16:32 crc kubenswrapper[5200]: I1126 15:16:32.864339 5200 scope.go:117] "RemoveContainer" containerID="893b8e4b44107b00eb1b9da82b94d9308f2b151c830699345b52bf50a0e6456a" Nov 26 15:16:32 crc kubenswrapper[5200]: I1126 15:16:32.888013 5200 scope.go:117] "RemoveContainer" containerID="bbafab13bed42c22a0279322a582fc84f999c3b861830aff723737c281c2aab7" Nov 26 15:16:32 crc kubenswrapper[5200]: I1126 15:16:32.998300 5200 scope.go:117] "RemoveContainer" containerID="3e60d37e8e4e2bb5fdc26907039428d4fa5cc5b7b8b3192f35851f5a7542b2d4" Nov 26 15:16:42 crc kubenswrapper[5200]: E1126 15:16:42.666594 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2433864 actualBytes=10240 Nov 26 15:16:50 crc kubenswrapper[5200]: I1126 15:16:50.201651 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 15:17:03 crc kubenswrapper[5200]: I1126 15:17:03.154477 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:17:03 crc kubenswrapper[5200]: I1126 15:17:03.155046 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.671939 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65945dc6f9-r4l8s"] Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.685476 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.691134 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1\"" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.717298 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65945dc6f9-r4l8s"] Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.815034 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-sb\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.815462 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-config\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.815570 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8944\" (UniqueName: \"kubernetes.io/projected/d55846ab-4971-47b4-a08c-57d019d52565-kube-api-access-h8944\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.815748 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-dns-svc\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.815849 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-openstack-cell1\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.816002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-nb\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.917729 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-sb\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.917783 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-config\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: 
\"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.917812 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8944\" (UniqueName: \"kubernetes.io/projected/d55846ab-4971-47b4-a08c-57d019d52565-kube-api-access-h8944\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.917896 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-dns-svc\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.917912 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-openstack-cell1\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.917985 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-nb\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.919443 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-nb\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.919605 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-sb\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.919983 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-config\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.920006 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-dns-svc\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc kubenswrapper[5200]: I1126 15:17:04.920232 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-openstack-cell1\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:04 crc 
kubenswrapper[5200]: I1126 15:17:04.937831 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8944\" (UniqueName: \"kubernetes.io/projected/d55846ab-4971-47b4-a08c-57d019d52565-kube-api-access-h8944\") pod \"dnsmasq-dns-65945dc6f9-r4l8s\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:05 crc kubenswrapper[5200]: I1126 15:17:05.026229 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:05 crc kubenswrapper[5200]: I1126 15:17:05.544499 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65945dc6f9-r4l8s"] Nov 26 15:17:05 crc kubenswrapper[5200]: I1126 15:17:05.695503 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" event={"ID":"d55846ab-4971-47b4-a08c-57d019d52565","Type":"ContainerStarted","Data":"101ce92a28939bd517d58baafbb69f6948622b368a2c8ae169994c8052eead23"} Nov 26 15:17:05 crc kubenswrapper[5200]: I1126 15:17:05.937386 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kv9df"] Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.143012 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kv9df"] Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.143150 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.251396 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6zz\" (UniqueName: \"kubernetes.io/projected/12b0eb92-bd49-4f77-bab3-aca9705a0845-kube-api-access-cp6zz\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.251757 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-catalog-content\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.251838 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-utilities\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.355022 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6zz\" (UniqueName: \"kubernetes.io/projected/12b0eb92-bd49-4f77-bab3-aca9705a0845-kube-api-access-cp6zz\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.355130 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-catalog-content\") pod \"certified-operators-kv9df\" (UID: 
\"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.355165 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-utilities\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.355645 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-catalog-content\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.355830 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-utilities\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.376107 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp6zz\" (UniqueName: \"kubernetes.io/projected/12b0eb92-bd49-4f77-bab3-aca9705a0845-kube-api-access-cp6zz\") pod \"certified-operators-kv9df\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.473640 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.708220 5200 generic.go:358] "Generic (PLEG): container finished" podID="d55846ab-4971-47b4-a08c-57d019d52565" containerID="d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473" exitCode=0 Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.708277 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" event={"ID":"d55846ab-4971-47b4-a08c-57d019d52565","Type":"ContainerDied","Data":"d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473"} Nov 26 15:17:06 crc kubenswrapper[5200]: W1126 15:17:06.959121 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b0eb92_bd49_4f77_bab3_aca9705a0845.slice/crio-380a77a07f5233ba9621c3afb84423ca8f1c3d5b6164f0f41841c10bc05e7a0b WatchSource:0}: Error finding container 380a77a07f5233ba9621c3afb84423ca8f1c3d5b6164f0f41841c10bc05e7a0b: Status 404 returned error can't find the container with id 380a77a07f5233ba9621c3afb84423ca8f1c3d5b6164f0f41841c10bc05e7a0b Nov 26 15:17:06 crc kubenswrapper[5200]: I1126 15:17:06.964675 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kv9df"] Nov 26 15:17:07 crc kubenswrapper[5200]: I1126 15:17:07.728353 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" event={"ID":"d55846ab-4971-47b4-a08c-57d019d52565","Type":"ContainerStarted","Data":"ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c"} Nov 26 15:17:07 crc kubenswrapper[5200]: I1126 15:17:07.728707 5200 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:07 crc kubenswrapper[5200]: I1126 15:17:07.730845 5200 generic.go:358] "Generic (PLEG): container finished" podID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerID="bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b" exitCode=0 Nov 26 15:17:07 crc kubenswrapper[5200]: I1126 15:17:07.731327 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerDied","Data":"bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b"} Nov 26 15:17:07 crc kubenswrapper[5200]: I1126 15:17:07.731376 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerStarted","Data":"380a77a07f5233ba9621c3afb84423ca8f1c3d5b6164f0f41841c10bc05e7a0b"} Nov 26 15:17:07 crc kubenswrapper[5200]: I1126 15:17:07.765155 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" podStartSLOduration=3.765120845 podStartE2EDuration="3.765120845s" podCreationTimestamp="2025-11-26 15:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:17:07.752041144 +0000 UTC m=+6396.431242982" watchObservedRunningTime="2025-11-26 15:17:07.765120845 +0000 UTC m=+6396.444322663" Nov 26 15:17:09 crc kubenswrapper[5200]: I1126 15:17:09.757804 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerStarted","Data":"a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f"} Nov 26 15:17:10 crc kubenswrapper[5200]: I1126 15:17:10.771051 5200 generic.go:358] "Generic (PLEG): container finished" podID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerID="a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f" exitCode=0 Nov 26 15:17:10 crc kubenswrapper[5200]: I1126 15:17:10.771441 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerDied","Data":"a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f"} Nov 26 15:17:11 crc kubenswrapper[5200]: I1126 15:17:11.792235 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerStarted","Data":"650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470"} Nov 26 15:17:11 crc kubenswrapper[5200]: I1126 15:17:11.828090 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kv9df" podStartSLOduration=5.649954844 podStartE2EDuration="6.828061537s" podCreationTimestamp="2025-11-26 15:17:05 +0000 UTC" firstStartedPulling="2025-11-26 15:17:07.733077255 +0000 UTC m=+6396.412279073" lastFinishedPulling="2025-11-26 15:17:08.911183948 +0000 UTC m=+6397.590385766" observedRunningTime="2025-11-26 15:17:11.8106611 +0000 UTC m=+6400.489862938" watchObservedRunningTime="2025-11-26 15:17:11.828061537 +0000 UTC m=+6400.507263355" Nov 26 15:17:13 crc kubenswrapper[5200]: I1126 15:17:13.747835 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:13 crc kubenswrapper[5200]: I1126 15:17:13.816483 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-678f5cc89-q5hvk"] Nov 26 15:17:13 crc kubenswrapper[5200]: I1126 15:17:13.816737 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerName="dnsmasq-dns" containerID="cri-o://96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301" gracePeriod=10 Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.076910 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dcc4787f-bkzrl"] Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.083671 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.120808 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dcc4787f-bkzrl"] Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.148636 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-dns-svc\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.150854 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-ovsdbserver-sb\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.151052 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49bn\" (UniqueName: \"kubernetes.io/projected/82d441b1-167f-4309-92c2-36abaca7f602-kube-api-access-l49bn\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.151278 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-config\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.151435 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-ovsdbserver-nb\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.151784 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-openstack-cell1\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: 
I1126 15:17:14.254945 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-dns-svc\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.255092 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-ovsdbserver-sb\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.255121 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l49bn\" (UniqueName: \"kubernetes.io/projected/82d441b1-167f-4309-92c2-36abaca7f602-kube-api-access-l49bn\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.255220 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-config\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.255259 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-ovsdbserver-nb\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.255338 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-openstack-cell1\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.256923 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-config\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.257554 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-openstack-cell1\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.258286 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-dns-svc\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.258453 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-ovsdbserver-sb\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.258490 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82d441b1-167f-4309-92c2-36abaca7f602-ovsdbserver-nb\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.298279 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l49bn\" (UniqueName: \"kubernetes.io/projected/82d441b1-167f-4309-92c2-36abaca7f602-kube-api-access-l49bn\") pod \"dnsmasq-dns-74dcc4787f-bkzrl\" (UID: \"82d441b1-167f-4309-92c2-36abaca7f602\") " pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.412190 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.577426 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.666712 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-nb\") pod \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.667191 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-dns-svc\") pod \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.667296 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-config\") pod \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.667421 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-sb\") pod \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.667456 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t86v8\" (UniqueName: \"kubernetes.io/projected/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-kube-api-access-t86v8\") pod \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\" (UID: \"8acc51b7-8a56-4e06-a7e4-df94f2d2a551\") " Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.679945 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-kube-api-access-t86v8" (OuterVolumeSpecName: "kube-api-access-t86v8") pod "8acc51b7-8a56-4e06-a7e4-df94f2d2a551" (UID: "8acc51b7-8a56-4e06-a7e4-df94f2d2a551"). InnerVolumeSpecName "kube-api-access-t86v8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.738570 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8acc51b7-8a56-4e06-a7e4-df94f2d2a551" (UID: "8acc51b7-8a56-4e06-a7e4-df94f2d2a551"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.761176 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8acc51b7-8a56-4e06-a7e4-df94f2d2a551" (UID: "8acc51b7-8a56-4e06-a7e4-df94f2d2a551"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.767543 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8acc51b7-8a56-4e06-a7e4-df94f2d2a551" (UID: "8acc51b7-8a56-4e06-a7e4-df94f2d2a551"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.771537 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t86v8\" (UniqueName: \"kubernetes.io/projected/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-kube-api-access-t86v8\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.771768 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.771780 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.771788 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.785208 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-config" (OuterVolumeSpecName: "config") pod "8acc51b7-8a56-4e06-a7e4-df94f2d2a551" (UID: "8acc51b7-8a56-4e06-a7e4-df94f2d2a551"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.833215 5200 generic.go:358] "Generic (PLEG): container finished" podID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerID="96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301" exitCode=0 Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.833461 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.833540 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" event={"ID":"8acc51b7-8a56-4e06-a7e4-df94f2d2a551","Type":"ContainerDied","Data":"96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301"} Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.833572 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f5cc89-q5hvk" event={"ID":"8acc51b7-8a56-4e06-a7e4-df94f2d2a551","Type":"ContainerDied","Data":"03a162a052e8450f373ff7f419ee81d7eb70c33bc69dc4d82c9355922252a873"} Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.833589 5200 scope.go:117] "RemoveContainer" containerID="96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.877306 5200 scope.go:117] "RemoveContainer" containerID="b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.878071 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8acc51b7-8a56-4e06-a7e4-df94f2d2a551-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.883678 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-678f5cc89-q5hvk"] Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.900255 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-678f5cc89-q5hvk"] Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.909022 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dcc4787f-bkzrl"] Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.911625 5200 scope.go:117] "RemoveContainer" containerID="96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301" Nov 26 15:17:14 crc kubenswrapper[5200]: E1126 15:17:14.912056 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301\": container with ID starting with 96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301 not found: ID does not exist" containerID="96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.912089 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301"} err="failed to get container status \"96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301\": rpc error: code = NotFound desc = could not find container \"96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301\": container with ID starting with 96932ddebb5e797b98d763c31438760e70da876ed757a396b58bd01b32906301 not found: ID does not exist" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.912112 5200 scope.go:117] "RemoveContainer" containerID="b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9" Nov 26 15:17:14 crc kubenswrapper[5200]: E1126 15:17:14.912840 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9\": container with ID starting with 
b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9 not found: ID does not exist" containerID="b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9" Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.912873 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9"} err="failed to get container status \"b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9\": rpc error: code = NotFound desc = could not find container \"b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9\": container with ID starting with b7eeb82da3d747dca7021371f83705da522474376a0c2e9cab56fb6b8634d0d9 not found: ID does not exist" Nov 26 15:17:14 crc kubenswrapper[5200]: W1126 15:17:14.918249 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82d441b1_167f_4309_92c2_36abaca7f602.slice/crio-a361cfb4454162cd5be6f26c7ee9dcccf6c0162e98630c6d1556c268a7b24420 WatchSource:0}: Error finding container a361cfb4454162cd5be6f26c7ee9dcccf6c0162e98630c6d1556c268a7b24420: Status 404 returned error can't find the container with id a361cfb4454162cd5be6f26c7ee9dcccf6c0162e98630c6d1556c268a7b24420 Nov 26 15:17:14 crc kubenswrapper[5200]: I1126 15:17:14.988960 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" path="/var/lib/kubelet/pods/8acc51b7-8a56-4e06-a7e4-df94f2d2a551/volumes" Nov 26 15:17:15 crc kubenswrapper[5200]: I1126 15:17:15.846612 5200 generic.go:358] "Generic (PLEG): container finished" podID="82d441b1-167f-4309-92c2-36abaca7f602" containerID="3f7ecd14f4056a594cc612cd77338d053779a6799c8adf6525dde73cca2fb1d1" exitCode=0 Nov 26 15:17:15 crc kubenswrapper[5200]: I1126 15:17:15.846661 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" event={"ID":"82d441b1-167f-4309-92c2-36abaca7f602","Type":"ContainerDied","Data":"3f7ecd14f4056a594cc612cd77338d053779a6799c8adf6525dde73cca2fb1d1"} Nov 26 15:17:15 crc kubenswrapper[5200]: I1126 15:17:15.847060 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" event={"ID":"82d441b1-167f-4309-92c2-36abaca7f602","Type":"ContainerStarted","Data":"a361cfb4454162cd5be6f26c7ee9dcccf6c0162e98630c6d1556c268a7b24420"} Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.474831 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.475224 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.523178 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.858784 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" event={"ID":"82d441b1-167f-4309-92c2-36abaca7f602","Type":"ContainerStarted","Data":"96fb7752cdcee27d9d003f2224d6ba223d7ff56caedfaf8816ee6cc89b448830"} Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.860047 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:16 crc 
kubenswrapper[5200]: I1126 15:17:16.881447 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" podStartSLOduration=2.8814066240000002 podStartE2EDuration="2.881406624s" podCreationTimestamp="2025-11-26 15:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:17:16.879255056 +0000 UTC m=+6405.558456874" watchObservedRunningTime="2025-11-26 15:17:16.881406624 +0000 UTC m=+6405.560608452" Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.921103 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:16 crc kubenswrapper[5200]: I1126 15:17:16.980509 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kv9df"] Nov 26 15:17:18 crc kubenswrapper[5200]: I1126 15:17:18.885357 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kv9df" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="registry-server" containerID="cri-o://650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470" gracePeriod=2 Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.369327 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.387751 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp6zz\" (UniqueName: \"kubernetes.io/projected/12b0eb92-bd49-4f77-bab3-aca9705a0845-kube-api-access-cp6zz\") pod \"12b0eb92-bd49-4f77-bab3-aca9705a0845\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.387809 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-catalog-content\") pod \"12b0eb92-bd49-4f77-bab3-aca9705a0845\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.388166 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-utilities\") pod \"12b0eb92-bd49-4f77-bab3-aca9705a0845\" (UID: \"12b0eb92-bd49-4f77-bab3-aca9705a0845\") " Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.389526 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-utilities" (OuterVolumeSpecName: "utilities") pod "12b0eb92-bd49-4f77-bab3-aca9705a0845" (UID: "12b0eb92-bd49-4f77-bab3-aca9705a0845"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.399870 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b0eb92-bd49-4f77-bab3-aca9705a0845-kube-api-access-cp6zz" (OuterVolumeSpecName: "kube-api-access-cp6zz") pod "12b0eb92-bd49-4f77-bab3-aca9705a0845" (UID: "12b0eb92-bd49-4f77-bab3-aca9705a0845"). InnerVolumeSpecName "kube-api-access-cp6zz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.439238 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12b0eb92-bd49-4f77-bab3-aca9705a0845" (UID: "12b0eb92-bd49-4f77-bab3-aca9705a0845"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.490125 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.490160 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cp6zz\" (UniqueName: \"kubernetes.io/projected/12b0eb92-bd49-4f77-bab3-aca9705a0845-kube-api-access-cp6zz\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.490169 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b0eb92-bd49-4f77-bab3-aca9705a0845-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.921867 5200 generic.go:358] "Generic (PLEG): container finished" podID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerID="650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470" exitCode=0 Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.922044 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv9df" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.922041 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerDied","Data":"650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470"} Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.922394 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv9df" event={"ID":"12b0eb92-bd49-4f77-bab3-aca9705a0845","Type":"ContainerDied","Data":"380a77a07f5233ba9621c3afb84423ca8f1c3d5b6164f0f41841c10bc05e7a0b"} Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.922428 5200 scope.go:117] "RemoveContainer" containerID="650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.955123 5200 scope.go:117] "RemoveContainer" containerID="a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.971213 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kv9df"] Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.978905 5200 scope.go:117] "RemoveContainer" containerID="bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b" Nov 26 15:17:19 crc kubenswrapper[5200]: I1126 15:17:19.980639 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kv9df"] Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.023954 5200 scope.go:117] "RemoveContainer" containerID="650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470" Nov 26 15:17:20 crc kubenswrapper[5200]: E1126 15:17:20.024558 5200 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470\": container with ID starting with 650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470 not found: ID does not exist" containerID="650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470" Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.024607 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470"} err="failed to get container status \"650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470\": rpc error: code = NotFound desc = could not find container \"650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470\": container with ID starting with 650b735a3df492fbf00357c8fb87d3dc05dcfaf9c1394f8daa3638632d721470 not found: ID does not exist" Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.024637 5200 scope.go:117] "RemoveContainer" containerID="a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f" Nov 26 15:17:20 crc kubenswrapper[5200]: E1126 15:17:20.024939 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f\": container with ID starting with a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f not found: ID does not exist" containerID="a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f" Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.024963 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f"} err="failed to get container status \"a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f\": rpc error: code = NotFound desc = could not find container \"a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f\": container with ID starting with a22087bbcd66d67ab21bb9c2611240105ee919df05b91cb69e66c4464a33378f not found: ID does not exist" Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.024980 5200 scope.go:117] "RemoveContainer" containerID="bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b" Nov 26 15:17:20 crc kubenswrapper[5200]: E1126 15:17:20.025205 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b\": container with ID starting with bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b not found: ID does not exist" containerID="bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b" Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.025228 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b"} err="failed to get container status \"bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b\": rpc error: code = NotFound desc = could not find container \"bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b\": container with ID starting with bd77012b69d85b56027b89f6807d81a8abe8cd9af1264e672529e76ca2a1ca8b not found: ID does not exist" Nov 26 15:17:20 crc kubenswrapper[5200]: I1126 15:17:20.997396 5200 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" path="/var/lib/kubelet/pods/12b0eb92-bd49-4f77-bab3-aca9705a0845/volumes" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.932843 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q"] Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934508 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerName="init" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934531 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerName="init" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934552 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="extract-utilities" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934559 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="extract-utilities" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934578 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerName="dnsmasq-dns" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934584 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerName="dnsmasq-dns" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934599 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="registry-server" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934605 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="registry-server" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934611 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="extract-content" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934616 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="extract-content" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934834 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8acc51b7-8a56-4e06-a7e4-df94f2d2a551" containerName="dnsmasq-dns" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.934861 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="12b0eb92-bd49-4f77-bab3-aca9705a0845" containerName="registry-server" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.950735 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.955304 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.955439 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.961474 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q"] Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.961945 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:17:21 crc kubenswrapper[5200]: I1126 15:17:21.969443 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.069898 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.070049 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.070636 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.070786 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbsm\" (UniqueName: \"kubernetes.io/projected/63d2b0dc-1fe3-4f96-b48c-397b167f509f-kube-api-access-glbsm\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.071039 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.173450 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.173520 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.173699 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.173740 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glbsm\" (UniqueName: \"kubernetes.io/projected/63d2b0dc-1fe3-4f96-b48c-397b167f509f-kube-api-access-glbsm\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.173814 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.181125 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.182054 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.182288 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 
15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.185619 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.193148 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbsm\" (UniqueName: \"kubernetes.io/projected/63d2b0dc-1fe3-4f96-b48c-397b167f509f-kube-api-access-glbsm\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cz746q\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.296462 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.913294 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q"] Nov 26 15:17:22 crc kubenswrapper[5200]: I1126 15:17:22.986064 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" event={"ID":"63d2b0dc-1fe3-4f96-b48c-397b167f509f","Type":"ContainerStarted","Data":"841186981dfea83731f21906413a24f7c3498c474d7a6ae41769c7b4497271c8"} Nov 26 15:17:23 crc kubenswrapper[5200]: I1126 15:17:23.886500 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dcc4787f-bkzrl" Nov 26 15:17:23 crc kubenswrapper[5200]: I1126 15:17:23.982864 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65945dc6f9-r4l8s"] Nov 26 15:17:23 crc kubenswrapper[5200]: I1126 15:17:23.983154 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" podUID="d55846ab-4971-47b4-a08c-57d019d52565" containerName="dnsmasq-dns" containerID="cri-o://ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c" gracePeriod=10 Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.588008 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.752346 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-dns-svc\") pod \"d55846ab-4971-47b4-a08c-57d019d52565\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.752529 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-sb\") pod \"d55846ab-4971-47b4-a08c-57d019d52565\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.752610 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8944\" (UniqueName: \"kubernetes.io/projected/d55846ab-4971-47b4-a08c-57d019d52565-kube-api-access-h8944\") pod \"d55846ab-4971-47b4-a08c-57d019d52565\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.752753 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-nb\") pod \"d55846ab-4971-47b4-a08c-57d019d52565\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.752772 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-config\") pod \"d55846ab-4971-47b4-a08c-57d019d52565\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.752847 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-openstack-cell1\") pod \"d55846ab-4971-47b4-a08c-57d019d52565\" (UID: \"d55846ab-4971-47b4-a08c-57d019d52565\") " Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.758923 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55846ab-4971-47b4-a08c-57d019d52565-kube-api-access-h8944" (OuterVolumeSpecName: "kube-api-access-h8944") pod "d55846ab-4971-47b4-a08c-57d019d52565" (UID: "d55846ab-4971-47b4-a08c-57d019d52565"). InnerVolumeSpecName "kube-api-access-h8944". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.811093 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d55846ab-4971-47b4-a08c-57d019d52565" (UID: "d55846ab-4971-47b4-a08c-57d019d52565"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.812781 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-config" (OuterVolumeSpecName: "config") pod "d55846ab-4971-47b4-a08c-57d019d52565" (UID: "d55846ab-4971-47b4-a08c-57d019d52565"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.817706 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "d55846ab-4971-47b4-a08c-57d019d52565" (UID: "d55846ab-4971-47b4-a08c-57d019d52565"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.817841 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d55846ab-4971-47b4-a08c-57d019d52565" (UID: "d55846ab-4971-47b4-a08c-57d019d52565"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.828440 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d55846ab-4971-47b4-a08c-57d019d52565" (UID: "d55846ab-4971-47b4-a08c-57d019d52565"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.856298 5200 reconciler_common.go:299] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.856352 5200 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.856362 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.856372 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8944\" (UniqueName: \"kubernetes.io/projected/d55846ab-4971-47b4-a08c-57d019d52565-kube-api-access-h8944\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.856384 5200 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:24 crc kubenswrapper[5200]: I1126 15:17:24.856392 5200 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55846ab-4971-47b4-a08c-57d019d52565-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.025695 5200 generic.go:358] "Generic (PLEG): container finished" podID="d55846ab-4971-47b4-a08c-57d019d52565" containerID="ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c" exitCode=0 Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.025817 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.025829 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" event={"ID":"d55846ab-4971-47b4-a08c-57d019d52565","Type":"ContainerDied","Data":"ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c"} Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.025906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65945dc6f9-r4l8s" event={"ID":"d55846ab-4971-47b4-a08c-57d019d52565","Type":"ContainerDied","Data":"101ce92a28939bd517d58baafbb69f6948622b368a2c8ae169994c8052eead23"} Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.025932 5200 scope.go:117] "RemoveContainer" containerID="ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.064995 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65945dc6f9-r4l8s"] Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.077610 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65945dc6f9-r4l8s"] Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.081250 5200 scope.go:117] "RemoveContainer" containerID="d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.115881 5200 scope.go:117] "RemoveContainer" containerID="ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c" Nov 26 15:17:25 crc kubenswrapper[5200]: E1126 15:17:25.118241 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c\": container with ID starting with ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c not found: ID does not exist" containerID="ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.118284 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c"} err="failed to get container status \"ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c\": rpc error: code = NotFound desc = could not find container \"ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c\": container with ID starting with ba951d53927ee187f0335540bd9cb6bd5328b3c033de9e8b6bf7d1fcede4d39c not found: ID does not exist" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.118308 5200 scope.go:117] "RemoveContainer" containerID="d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473" Nov 26 15:17:25 crc kubenswrapper[5200]: E1126 15:17:25.121151 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473\": container with ID starting with d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473 not found: ID does not exist" containerID="d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473" Nov 26 15:17:25 crc kubenswrapper[5200]: I1126 15:17:25.121237 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473"} err="failed to get container status 
\"d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473\": rpc error: code = NotFound desc = could not find container \"d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473\": container with ID starting with d42b77f8fb999d98284649893c79496d3109a7665a195cd8f1ab19e7c582e473 not found: ID does not exist" Nov 26 15:17:26 crc kubenswrapper[5200]: I1126 15:17:26.981862 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55846ab-4971-47b4-a08c-57d019d52565" path="/var/lib/kubelet/pods/d55846ab-4971-47b4-a08c-57d019d52565/volumes" Nov 26 15:17:33 crc kubenswrapper[5200]: I1126 15:17:33.154568 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:17:33 crc kubenswrapper[5200]: I1126 15:17:33.155273 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:17:34 crc kubenswrapper[5200]: I1126 15:17:34.138915 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" event={"ID":"63d2b0dc-1fe3-4f96-b48c-397b167f509f","Type":"ContainerStarted","Data":"3faf11e9ccc3ad25b2c91ec8ec471fd32957c3ec177cbb8e60a3ee3ef5efc0e4"} Nov 26 15:17:34 crc kubenswrapper[5200]: I1126 15:17:34.162828 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" podStartSLOduration=2.812224322 podStartE2EDuration="13.162807511s" podCreationTimestamp="2025-11-26 15:17:21 +0000 UTC" firstStartedPulling="2025-11-26 15:17:22.917697803 +0000 UTC m=+6411.596899631" lastFinishedPulling="2025-11-26 15:17:33.268281002 +0000 UTC m=+6421.947482820" observedRunningTime="2025-11-26 15:17:34.158446044 +0000 UTC m=+6422.837647882" watchObservedRunningTime="2025-11-26 15:17:34.162807511 +0000 UTC m=+6422.842009339" Nov 26 15:17:42 crc kubenswrapper[5200]: E1126 15:17:42.817904 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442116 actualBytes=10240 Nov 26 15:17:48 crc kubenswrapper[5200]: I1126 15:17:48.291493 5200 generic.go:358] "Generic (PLEG): container finished" podID="63d2b0dc-1fe3-4f96-b48c-397b167f509f" containerID="3faf11e9ccc3ad25b2c91ec8ec471fd32957c3ec177cbb8e60a3ee3ef5efc0e4" exitCode=0 Nov 26 15:17:48 crc kubenswrapper[5200]: I1126 15:17:48.291586 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" event={"ID":"63d2b0dc-1fe3-4f96-b48c-397b167f509f","Type":"ContainerDied","Data":"3faf11e9ccc3ad25b2c91ec8ec471fd32957c3ec177cbb8e60a3ee3ef5efc0e4"} Nov 26 15:17:49 crc kubenswrapper[5200]: I1126 15:17:49.977242 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.077551 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ceph\") pod \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.077644 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glbsm\" (UniqueName: \"kubernetes.io/projected/63d2b0dc-1fe3-4f96-b48c-397b167f509f-kube-api-access-glbsm\") pod \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.077786 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-inventory\") pod \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.077892 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-pre-adoption-validation-combined-ca-bundle\") pod \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.077931 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ssh-key\") pod \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\" (UID: \"63d2b0dc-1fe3-4f96-b48c-397b167f509f\") " Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.084652 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ceph" (OuterVolumeSpecName: "ceph") pod "63d2b0dc-1fe3-4f96-b48c-397b167f509f" (UID: "63d2b0dc-1fe3-4f96-b48c-397b167f509f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.085080 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "63d2b0dc-1fe3-4f96-b48c-397b167f509f" (UID: "63d2b0dc-1fe3-4f96-b48c-397b167f509f"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.086382 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d2b0dc-1fe3-4f96-b48c-397b167f509f-kube-api-access-glbsm" (OuterVolumeSpecName: "kube-api-access-glbsm") pod "63d2b0dc-1fe3-4f96-b48c-397b167f509f" (UID: "63d2b0dc-1fe3-4f96-b48c-397b167f509f"). InnerVolumeSpecName "kube-api-access-glbsm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.114028 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-inventory" (OuterVolumeSpecName: "inventory") pod "63d2b0dc-1fe3-4f96-b48c-397b167f509f" (UID: "63d2b0dc-1fe3-4f96-b48c-397b167f509f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.117657 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "63d2b0dc-1fe3-4f96-b48c-397b167f509f" (UID: "63d2b0dc-1fe3-4f96-b48c-397b167f509f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.180479 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.180530 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-glbsm\" (UniqueName: \"kubernetes.io/projected/63d2b0dc-1fe3-4f96-b48c-397b167f509f-kube-api-access-glbsm\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.180541 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.180552 5200 reconciler_common.go:299] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.180562 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63d2b0dc-1fe3-4f96-b48c-397b167f509f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.314983 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.314992 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cz746q" event={"ID":"63d2b0dc-1fe3-4f96-b48c-397b167f509f","Type":"ContainerDied","Data":"841186981dfea83731f21906413a24f7c3498c474d7a6ae41769c7b4497271c8"} Nov 26 15:17:50 crc kubenswrapper[5200]: I1126 15:17:50.315024 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841186981dfea83731f21906413a24f7c3498c474d7a6ae41769c7b4497271c8" Nov 26 15:17:53 crc kubenswrapper[5200]: I1126 15:17:53.076177 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-zk9wv"] Nov 26 15:17:53 crc kubenswrapper[5200]: I1126 15:17:53.086777 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-zk9wv"] Nov 26 15:17:54 crc kubenswrapper[5200]: I1126 15:17:54.037542 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-436f-account-create-update-2vstr"] Nov 26 15:17:54 crc kubenswrapper[5200]: I1126 15:17:54.048410 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-436f-account-create-update-2vstr"] Nov 26 15:17:54 crc kubenswrapper[5200]: I1126 15:17:54.990414 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515c1e67-f058-4858-a438-6efc983b5a03" path="/var/lib/kubelet/pods/515c1e67-f058-4858-a438-6efc983b5a03/volumes" Nov 26 15:17:54 crc kubenswrapper[5200]: I1126 15:17:54.992006 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99476d44-82e6-494b-aa46-55e545bace8a" path="/var/lib/kubelet/pods/99476d44-82e6-494b-aa46-55e545bace8a/volumes" Nov 26 15:17:59 crc kubenswrapper[5200]: I1126 15:17:59.037036 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-g9nxx"] Nov 26 15:17:59 crc kubenswrapper[5200]: I1126 15:17:59.050116 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-g9nxx"] Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.038525 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-708c-account-create-update-p6m8q"] Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.048419 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-708c-account-create-update-p6m8q"] Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.170843 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv"] Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.186241 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d55846ab-4971-47b4-a08c-57d019d52565" containerName="dnsmasq-dns" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.186302 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55846ab-4971-47b4-a08c-57d019d52565" containerName="dnsmasq-dns" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.186373 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63d2b0dc-1fe3-4f96-b48c-397b167f509f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.186380 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d2b0dc-1fe3-4f96-b48c-397b167f509f" 
containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.186425 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d55846ab-4971-47b4-a08c-57d019d52565" containerName="init" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.186431 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55846ab-4971-47b4-a08c-57d019d52565" containerName="init" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.187097 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="63d2b0dc-1fe3-4f96-b48c-397b167f509f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.187141 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d55846ab-4971-47b4-a08c-57d019d52565" containerName="dnsmasq-dns" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.206486 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.211352 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.211887 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.213225 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.214093 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.217149 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv"] Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.331030 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbcm\" (UniqueName: \"kubernetes.io/projected/180e251a-424a-4738-852d-1c2bb96044df-kube-api-access-6rbcm\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.331274 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.331402 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.331431 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.331454 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.434209 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbcm\" (UniqueName: \"kubernetes.io/projected/180e251a-424a-4738-852d-1c2bb96044df-kube-api-access-6rbcm\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.434599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.434876 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.435046 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.435174 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.440836 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.441052 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.441974 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.442305 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.454434 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbcm\" (UniqueName: \"kubernetes.io/projected/180e251a-424a-4738-852d-1c2bb96044df-kube-api-access-6rbcm\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.531964 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.983565 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c787ca-1064-48e6-9360-b3d6e6321d86" path="/var/lib/kubelet/pods/11c787ca-1064-48e6-9360-b3d6e6321d86/volumes" Nov 26 15:18:00 crc kubenswrapper[5200]: I1126 15:18:00.984883 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d1cce1-73da-4750-a358-0dc637e7026d" path="/var/lib/kubelet/pods/49d1cce1-73da-4750-a358-0dc637e7026d/volumes" Nov 26 15:18:01 crc kubenswrapper[5200]: I1126 15:18:01.101101 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv"] Nov 26 15:18:01 crc kubenswrapper[5200]: I1126 15:18:01.442940 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" event={"ID":"180e251a-424a-4738-852d-1c2bb96044df","Type":"ContainerStarted","Data":"332f2df63ccc91bd987bd9522e3b417253cc54739ce6162cc6f824b83d0ddb45"} Nov 26 15:18:02 crc kubenswrapper[5200]: I1126 15:18:02.452806 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.154114 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.154498 5200 prober.go:120] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.154543 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.155401 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.155526 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" gracePeriod=600 Nov 26 15:18:03 crc kubenswrapper[5200]: E1126 15:18:03.281517 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.470555 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" exitCode=0 Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.470642 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525"} Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.470730 5200 scope.go:117] "RemoveContainer" containerID="1682d578e628d48f9c00a7583ad02f38b321739db47ca670781a2baf1284d737" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.471526 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:18:03 crc kubenswrapper[5200]: E1126 15:18:03.472042 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.477993 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" 
event={"ID":"180e251a-424a-4738-852d-1c2bb96044df","Type":"ContainerStarted","Data":"5f37e84cc3a2b1e4c561f3df8b591f90ffba1fd75b19cff2c8481917406d3121"} Nov 26 15:18:03 crc kubenswrapper[5200]: I1126 15:18:03.521997 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" podStartSLOduration=2.178519219 podStartE2EDuration="3.521972982s" podCreationTimestamp="2025-11-26 15:18:00 +0000 UTC" firstStartedPulling="2025-11-26 15:18:01.10561084 +0000 UTC m=+6449.784812668" lastFinishedPulling="2025-11-26 15:18:02.449064613 +0000 UTC m=+6451.128266431" observedRunningTime="2025-11-26 15:18:03.503186337 +0000 UTC m=+6452.182388155" watchObservedRunningTime="2025-11-26 15:18:03.521972982 +0000 UTC m=+6452.201174800" Nov 26 15:18:17 crc kubenswrapper[5200]: I1126 15:18:17.972978 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:18:17 crc kubenswrapper[5200]: E1126 15:18:17.974223 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:18:28 crc kubenswrapper[5200]: I1126 15:18:28.970341 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:18:28 crc kubenswrapper[5200]: E1126 15:18:28.972003 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:18:33 crc kubenswrapper[5200]: I1126 15:18:33.254393 5200 scope.go:117] "RemoveContainer" containerID="01dbc8547202186d0bcaff29c4b0992d32225f8543c54be4cdc0efba2da80552" Nov 26 15:18:33 crc kubenswrapper[5200]: I1126 15:18:33.278789 5200 scope.go:117] "RemoveContainer" containerID="7c24a2c0fe463180eddd91c56dc8c84b5b19e7890e8297e6c1735a612e92e526" Nov 26 15:18:33 crc kubenswrapper[5200]: I1126 15:18:33.353207 5200 scope.go:117] "RemoveContainer" containerID="a136e76e3fd617c7b63332621e8d5c878a67a6a5a17be83c77a7a992153a1abc" Nov 26 15:18:33 crc kubenswrapper[5200]: I1126 15:18:33.387747 5200 scope.go:117] "RemoveContainer" containerID="91425ec885e2c8c75e89980389404ac7149e72fb82fa3b963a793ae3cea25aad" Nov 26 15:18:41 crc kubenswrapper[5200]: I1126 15:18:41.969785 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:18:41 crc kubenswrapper[5200]: E1126 15:18:41.970763 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:18:42 crc 
kubenswrapper[5200]: E1126 15:18:42.415309 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442025 actualBytes=10240 Nov 26 15:18:52 crc kubenswrapper[5200]: I1126 15:18:52.052096 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-4wzlf"] Nov 26 15:18:52 crc kubenswrapper[5200]: I1126 15:18:52.064779 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-4wzlf"] Nov 26 15:18:52 crc kubenswrapper[5200]: I1126 15:18:52.981730 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bab952e-c8d7-4737-a59a-ef8c81bb6256" path="/var/lib/kubelet/pods/9bab952e-c8d7-4737-a59a-ef8c81bb6256/volumes" Nov 26 15:18:53 crc kubenswrapper[5200]: I1126 15:18:53.969890 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:18:53 crc kubenswrapper[5200]: E1126 15:18:53.970508 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:19:05 crc kubenswrapper[5200]: I1126 15:19:05.970275 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:19:05 crc kubenswrapper[5200]: E1126 15:19:05.971075 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:19:16 crc kubenswrapper[5200]: I1126 15:19:16.969301 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:19:16 crc kubenswrapper[5200]: E1126 15:19:16.970229 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:19:30 crc kubenswrapper[5200]: I1126 15:19:30.969945 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:19:30 crc kubenswrapper[5200]: E1126 15:19:30.971077 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:19:33 crc kubenswrapper[5200]: I1126 15:19:33.549309 5200 scope.go:117] "RemoveContainer" 
containerID="9d21a098df1541a5fc40993e87159e3d19972c6f3c61f5b30e2e9b7c54b5d57e" Nov 26 15:19:33 crc kubenswrapper[5200]: I1126 15:19:33.584106 5200 scope.go:117] "RemoveContainer" containerID="33bca999f0b0ac65592ec44212465f5970290b732cf7a16cb569690e64156937" Nov 26 15:19:42 crc kubenswrapper[5200]: E1126 15:19:42.550124 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442016 actualBytes=10240 Nov 26 15:19:45 crc kubenswrapper[5200]: I1126 15:19:45.968991 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:19:45 crc kubenswrapper[5200]: E1126 15:19:45.969852 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:20:00 crc kubenswrapper[5200]: I1126 15:20:00.969875 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:20:00 crc kubenswrapper[5200]: E1126 15:20:00.970804 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:20:13 crc kubenswrapper[5200]: I1126 15:20:13.970262 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:20:13 crc kubenswrapper[5200]: E1126 15:20:13.970987 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:20:24 crc kubenswrapper[5200]: I1126 15:20:24.970909 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:20:24 crc kubenswrapper[5200]: E1126 15:20:24.971833 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:20:37 crc kubenswrapper[5200]: I1126 15:20:37.850064 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:20:37 crc kubenswrapper[5200]: I1126 15:20:37.850234 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:20:37 crc kubenswrapper[5200]: I1126 15:20:37.856750 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:20:37 crc kubenswrapper[5200]: I1126 15:20:37.856756 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:20:38 crc kubenswrapper[5200]: I1126 15:20:38.969753 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:20:38 crc kubenswrapper[5200]: E1126 15:20:38.970491 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:20:42 crc kubenswrapper[5200]: E1126 15:20:42.653222 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442014 actualBytes=10240 Nov 26 15:20:53 crc kubenswrapper[5200]: I1126 15:20:53.969098 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:20:53 crc kubenswrapper[5200]: E1126 15:20:53.969944 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.614916 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wz9wq"] Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.629066 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.716998 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wz9wq"] Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.796593 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12846da6-f887-4fb0-867b-cd706ca8379a-utilities\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.797144 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12846da6-f887-4fb0-867b-cd706ca8379a-catalog-content\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.797398 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jgg\" (UniqueName: \"kubernetes.io/projected/12846da6-f887-4fb0-867b-cd706ca8379a-kube-api-access-v6jgg\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.899783 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12846da6-f887-4fb0-867b-cd706ca8379a-utilities\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.899839 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12846da6-f887-4fb0-867b-cd706ca8379a-catalog-content\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.899875 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jgg\" (UniqueName: \"kubernetes.io/projected/12846da6-f887-4fb0-867b-cd706ca8379a-kube-api-access-v6jgg\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.900315 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12846da6-f887-4fb0-867b-cd706ca8379a-utilities\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.900676 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12846da6-f887-4fb0-867b-cd706ca8379a-catalog-content\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.921614 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v6jgg\" (UniqueName: \"kubernetes.io/projected/12846da6-f887-4fb0-867b-cd706ca8379a-kube-api-access-v6jgg\") pod \"community-operators-wz9wq\" (UID: \"12846da6-f887-4fb0-867b-cd706ca8379a\") " pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:57 crc kubenswrapper[5200]: I1126 15:20:57.960699 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:20:58 crc kubenswrapper[5200]: I1126 15:20:58.474299 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wz9wq"] Nov 26 15:20:58 crc kubenswrapper[5200]: W1126 15:20:58.476815 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12846da6_f887_4fb0_867b_cd706ca8379a.slice/crio-4f977d0f120a6bc9d8fe2fbaf5bb553ba94892284ad5b236cdea480907142ae8 WatchSource:0}: Error finding container 4f977d0f120a6bc9d8fe2fbaf5bb553ba94892284ad5b236cdea480907142ae8: Status 404 returned error can't find the container with id 4f977d0f120a6bc9d8fe2fbaf5bb553ba94892284ad5b236cdea480907142ae8 Nov 26 15:20:58 crc kubenswrapper[5200]: I1126 15:20:58.478804 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:20:58 crc kubenswrapper[5200]: I1126 15:20:58.781182 5200 generic.go:358] "Generic (PLEG): container finished" podID="12846da6-f887-4fb0-867b-cd706ca8379a" containerID="71ad434be6d20dec28804ada771eb8b558e0986044ad9ddc9db11987959e42be" exitCode=0 Nov 26 15:20:58 crc kubenswrapper[5200]: I1126 15:20:58.781259 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wz9wq" event={"ID":"12846da6-f887-4fb0-867b-cd706ca8379a","Type":"ContainerDied","Data":"71ad434be6d20dec28804ada771eb8b558e0986044ad9ddc9db11987959e42be"} Nov 26 15:20:58 crc kubenswrapper[5200]: I1126 15:20:58.781633 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wz9wq" event={"ID":"12846da6-f887-4fb0-867b-cd706ca8379a","Type":"ContainerStarted","Data":"4f977d0f120a6bc9d8fe2fbaf5bb553ba94892284ad5b236cdea480907142ae8"} Nov 26 15:21:01 crc kubenswrapper[5200]: I1126 15:21:01.815516 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wz9wq" event={"ID":"12846da6-f887-4fb0-867b-cd706ca8379a","Type":"ContainerStarted","Data":"d62584919687c04a2c46f231b9f1327b45d9057fdb9e655482dcf531e181dabf"} Nov 26 15:21:02 crc kubenswrapper[5200]: I1126 15:21:02.826557 5200 generic.go:358] "Generic (PLEG): container finished" podID="12846da6-f887-4fb0-867b-cd706ca8379a" containerID="d62584919687c04a2c46f231b9f1327b45d9057fdb9e655482dcf531e181dabf" exitCode=0 Nov 26 15:21:02 crc kubenswrapper[5200]: I1126 15:21:02.826724 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wz9wq" event={"ID":"12846da6-f887-4fb0-867b-cd706ca8379a","Type":"ContainerDied","Data":"d62584919687c04a2c46f231b9f1327b45d9057fdb9e655482dcf531e181dabf"} Nov 26 15:21:03 crc kubenswrapper[5200]: I1126 15:21:03.838714 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wz9wq" event={"ID":"12846da6-f887-4fb0-867b-cd706ca8379a","Type":"ContainerStarted","Data":"5c431d00dabb815a8303f513954b9631181cf4ac0171c15d2de28d67aa8abe9b"} Nov 26 15:21:03 crc kubenswrapper[5200]: I1126 
15:21:03.857265 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wz9wq" podStartSLOduration=4.10254385 podStartE2EDuration="6.857245976s" podCreationTimestamp="2025-11-26 15:20:57 +0000 UTC" firstStartedPulling="2025-11-26 15:20:58.783317837 +0000 UTC m=+6627.462519695" lastFinishedPulling="2025-11-26 15:21:01.538020003 +0000 UTC m=+6630.217221821" observedRunningTime="2025-11-26 15:21:03.855371355 +0000 UTC m=+6632.534573183" watchObservedRunningTime="2025-11-26 15:21:03.857245976 +0000 UTC m=+6632.536447804" Nov 26 15:21:07 crc kubenswrapper[5200]: I1126 15:21:07.961881 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:21:07 crc kubenswrapper[5200]: I1126 15:21:07.962495 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:21:08 crc kubenswrapper[5200]: I1126 15:21:08.007797 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:21:08 crc kubenswrapper[5200]: I1126 15:21:08.938350 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wz9wq" Nov 26 15:21:08 crc kubenswrapper[5200]: I1126 15:21:08.969472 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:21:08 crc kubenswrapper[5200]: E1126 15:21:08.969991 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:21:09 crc kubenswrapper[5200]: I1126 15:21:09.071586 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wz9wq"] Nov 26 15:21:09 crc kubenswrapper[5200]: I1126 15:21:09.135196 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 15:21:09 crc kubenswrapper[5200]: I1126 15:21:09.135496 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f29zg" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="registry-server" containerID="cri-o://271baf04f9b1184de2b53ac3db2448e1f9fe3d751ed9c3fdb379ea736aefe285" gracePeriod=2 Nov 26 15:21:09 crc kubenswrapper[5200]: I1126 15:21:09.898002 5200 generic.go:358] "Generic (PLEG): container finished" podID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerID="271baf04f9b1184de2b53ac3db2448e1f9fe3d751ed9c3fdb379ea736aefe285" exitCode=0 Nov 26 15:21:09 crc kubenswrapper[5200]: I1126 15:21:09.898056 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerDied","Data":"271baf04f9b1184de2b53ac3db2448e1f9fe3d751ed9c3fdb379ea736aefe285"} Nov 26 15:21:10 crc kubenswrapper[5200]: I1126 15:21:10.847091 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f29zg" Nov 26 15:21:10 crc kubenswrapper[5200]: I1126 15:21:10.913552 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f29zg" Nov 26 15:21:10 crc kubenswrapper[5200]: I1126 15:21:10.913971 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f29zg" event={"ID":"bc17dc43-8f4d-4fce-af04-702d98df7231","Type":"ContainerDied","Data":"7e4005ba40d6f834ccb09b0e0b712841e369fe2b633b5773a7f94d920d506fd5"} Nov 26 15:21:10 crc kubenswrapper[5200]: I1126 15:21:10.914008 5200 scope.go:117] "RemoveContainer" containerID="271baf04f9b1184de2b53ac3db2448e1f9fe3d751ed9c3fdb379ea736aefe285" Nov 26 15:21:10 crc kubenswrapper[5200]: I1126 15:21:10.946621 5200 scope.go:117] "RemoveContainer" containerID="d6fba336bb4a3fcdb552d6d30950da5c8b9b1bbb78cadab24b3598b9080ddc95" Nov 26 15:21:10 crc kubenswrapper[5200]: I1126 15:21:10.974664 5200 scope.go:117] "RemoveContainer" containerID="90c949097da57d8a36f09563fb7602ed9432606ea5e919a471e029a24555c577" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.019098 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-utilities\") pod \"bc17dc43-8f4d-4fce-af04-702d98df7231\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.019278 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-catalog-content\") pod \"bc17dc43-8f4d-4fce-af04-702d98df7231\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.019312 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds62d\" (UniqueName: \"kubernetes.io/projected/bc17dc43-8f4d-4fce-af04-702d98df7231-kube-api-access-ds62d\") pod \"bc17dc43-8f4d-4fce-af04-702d98df7231\" (UID: \"bc17dc43-8f4d-4fce-af04-702d98df7231\") " Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.019880 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-utilities" (OuterVolumeSpecName: "utilities") pod "bc17dc43-8f4d-4fce-af04-702d98df7231" (UID: "bc17dc43-8f4d-4fce-af04-702d98df7231"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.021553 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.025731 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc17dc43-8f4d-4fce-af04-702d98df7231-kube-api-access-ds62d" (OuterVolumeSpecName: "kube-api-access-ds62d") pod "bc17dc43-8f4d-4fce-af04-702d98df7231" (UID: "bc17dc43-8f4d-4fce-af04-702d98df7231"). InnerVolumeSpecName "kube-api-access-ds62d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.063459 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc17dc43-8f4d-4fce-af04-702d98df7231" (UID: "bc17dc43-8f4d-4fce-af04-702d98df7231"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.123618 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc17dc43-8f4d-4fce-af04-702d98df7231-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.123663 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ds62d\" (UniqueName: \"kubernetes.io/projected/bc17dc43-8f4d-4fce-af04-702d98df7231-kube-api-access-ds62d\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.248170 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 15:21:11 crc kubenswrapper[5200]: I1126 15:21:11.256701 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f29zg"] Nov 26 15:21:12 crc kubenswrapper[5200]: I1126 15:21:12.989417 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" path="/var/lib/kubelet/pods/bc17dc43-8f4d-4fce-af04-702d98df7231/volumes" Nov 26 15:21:19 crc kubenswrapper[5200]: I1126 15:21:19.970229 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:21:19 crc kubenswrapper[5200]: E1126 15:21:19.971236 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:21:32 crc kubenswrapper[5200]: I1126 15:21:32.981906 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:21:32 crc kubenswrapper[5200]: E1126 15:21:32.984435 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:21:42 crc kubenswrapper[5200]: E1126 15:21:42.698496 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443334 actualBytes=10240 Nov 26 15:21:45 crc kubenswrapper[5200]: I1126 15:21:45.969754 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:21:45 crc kubenswrapper[5200]: E1126 15:21:45.970875 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:21:57 crc kubenswrapper[5200]: I1126 15:21:57.969723 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:21:57 crc kubenswrapper[5200]: E1126 15:21:57.970677 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:22:12 crc kubenswrapper[5200]: I1126 15:22:12.993728 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:22:12 crc kubenswrapper[5200]: E1126 15:22:12.994591 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:22:27 crc kubenswrapper[5200]: I1126 15:22:27.969588 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:22:27 crc kubenswrapper[5200]: E1126 15:22:27.970540 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:22:41 crc kubenswrapper[5200]: I1126 15:22:41.970192 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:22:41 crc kubenswrapper[5200]: E1126 15:22:41.971169 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:22:42 crc kubenswrapper[5200]: E1126 15:22:42.563941 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443329 actualBytes=10240 Nov 26 15:22:45 crc kubenswrapper[5200]: I1126 15:22:45.062275 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-8cbf7"] Nov 26 15:22:45 crc kubenswrapper[5200]: I1126 15:22:45.097966 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/heat-34b5-account-create-update-sjvgl"] Nov 26 15:22:45 crc kubenswrapper[5200]: 
I1126 15:22:45.106703 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-8cbf7"] Nov 26 15:22:45 crc kubenswrapper[5200]: I1126 15:22:45.114062 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/heat-34b5-account-create-update-sjvgl"] Nov 26 15:22:46 crc kubenswrapper[5200]: I1126 15:22:46.984834 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79392be7-8cc7-4cda-90ff-0ef8d41597a0" path="/var/lib/kubelet/pods/79392be7-8cc7-4cda-90ff-0ef8d41597a0/volumes" Nov 26 15:22:46 crc kubenswrapper[5200]: I1126 15:22:46.986556 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d82eee2-b417-48ba-ac6c-51129e481b5f" path="/var/lib/kubelet/pods/8d82eee2-b417-48ba-ac6c-51129e481b5f/volumes" Nov 26 15:22:53 crc kubenswrapper[5200]: I1126 15:22:53.969733 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:22:53 crc kubenswrapper[5200]: E1126 15:22:53.970528 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:22:58 crc kubenswrapper[5200]: I1126 15:22:58.046248 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-4vfd4"] Nov 26 15:22:58 crc kubenswrapper[5200]: I1126 15:22:58.055254 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-4vfd4"] Nov 26 15:22:58 crc kubenswrapper[5200]: I1126 15:22:58.983120 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52475b3-abc6-48c1-ac1c-d351f7abf5b1" path="/var/lib/kubelet/pods/f52475b3-abc6-48c1-ac1c-d351f7abf5b1/volumes" Nov 26 15:23:06 crc kubenswrapper[5200]: I1126 15:23:06.970111 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:23:08 crc kubenswrapper[5200]: I1126 15:23:08.178436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"060c735b3c6b91a7b59f706f13708c0f59f974db6c9c74cdcd8ef34b9041e93c"} Nov 26 15:23:33 crc kubenswrapper[5200]: I1126 15:23:33.776621 5200 scope.go:117] "RemoveContainer" containerID="30d190782117914a809c0102acd0e7fdaf534f467bff912bc662541476bc4a22" Nov 26 15:23:33 crc kubenswrapper[5200]: I1126 15:23:33.856063 5200 scope.go:117] "RemoveContainer" containerID="c9b1e8020d2ad944ca4de4276a99497cb27a05da868b8098fafab7020172c776" Nov 26 15:23:33 crc kubenswrapper[5200]: I1126 15:23:33.890754 5200 scope.go:117] "RemoveContainer" containerID="c9b149d661751e2c373e21272555502885bb03d20e9bf2d86571ccced1d07788" Nov 26 15:23:42 crc kubenswrapper[5200]: E1126 15:23:42.603271 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443329 actualBytes=10240 Nov 26 15:24:42 crc kubenswrapper[5200]: E1126 15:24:42.617381 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441988 actualBytes=10240 Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.528952 5200 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-8c5vp"] Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534551 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="extract-content" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534583 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="extract-content" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534600 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="registry-server" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534606 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="registry-server" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534625 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="extract-utilities" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534631 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="extract-utilities" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.534852 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc17dc43-8f4d-4fce-af04-702d98df7231" containerName="registry-server" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.542384 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.544518 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8c5vp"] Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.595421 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-catalog-content\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.595508 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2289\" (UniqueName: \"kubernetes.io/projected/20f47c26-ed9e-46c8-864d-185b272044f7-kube-api-access-l2289\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.595554 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-utilities\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.697463 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-catalog-content\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.697560 
5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2289\" (UniqueName: \"kubernetes.io/projected/20f47c26-ed9e-46c8-864d-185b272044f7-kube-api-access-l2289\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.697615 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-utilities\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.698279 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-utilities\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.698547 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-catalog-content\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.733282 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2289\" (UniqueName: \"kubernetes.io/projected/20f47c26-ed9e-46c8-864d-185b272044f7-kube-api-access-l2289\") pod \"redhat-operators-8c5vp\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:58 crc kubenswrapper[5200]: I1126 15:24:58.871523 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:24:59 crc kubenswrapper[5200]: W1126 15:24:59.402775 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f47c26_ed9e_46c8_864d_185b272044f7.slice/crio-279512a6a1b00ec28e865e54842aca2d5948bbbc326515725eadd3cf92000f81 WatchSource:0}: Error finding container 279512a6a1b00ec28e865e54842aca2d5948bbbc326515725eadd3cf92000f81: Status 404 returned error can't find the container with id 279512a6a1b00ec28e865e54842aca2d5948bbbc326515725eadd3cf92000f81 Nov 26 15:24:59 crc kubenswrapper[5200]: I1126 15:24:59.408610 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8c5vp"] Nov 26 15:25:00 crc kubenswrapper[5200]: I1126 15:25:00.408599 5200 generic.go:358] "Generic (PLEG): container finished" podID="20f47c26-ed9e-46c8-864d-185b272044f7" containerID="7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a" exitCode=0 Nov 26 15:25:00 crc kubenswrapper[5200]: I1126 15:25:00.408653 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerDied","Data":"7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a"} Nov 26 15:25:00 crc kubenswrapper[5200]: I1126 15:25:00.408730 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerStarted","Data":"279512a6a1b00ec28e865e54842aca2d5948bbbc326515725eadd3cf92000f81"} Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.131931 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qdk"] Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.139161 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qdk"] Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.139308 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.278518 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-utilities\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.278779 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-catalog-content\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.278891 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7528q\" (UniqueName: \"kubernetes.io/projected/e5bec326-8370-4872-9a81-a9aee249006c-kube-api-access-7528q\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.380572 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-utilities\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.380657 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-catalog-content\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.380719 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7528q\" (UniqueName: \"kubernetes.io/projected/e5bec326-8370-4872-9a81-a9aee249006c-kube-api-access-7528q\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.381119 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-utilities\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.381334 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-catalog-content\") pod \"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.400186 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7528q\" (UniqueName: \"kubernetes.io/projected/e5bec326-8370-4872-9a81-a9aee249006c-kube-api-access-7528q\") pod 
\"redhat-marketplace-w9qdk\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.494851 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:01 crc kubenswrapper[5200]: I1126 15:25:01.964654 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qdk"] Nov 26 15:25:02 crc kubenswrapper[5200]: I1126 15:25:02.430712 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerStarted","Data":"848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc"} Nov 26 15:25:02 crc kubenswrapper[5200]: I1126 15:25:02.435316 5200 generic.go:358] "Generic (PLEG): container finished" podID="e5bec326-8370-4872-9a81-a9aee249006c" containerID="0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af" exitCode=0 Nov 26 15:25:02 crc kubenswrapper[5200]: I1126 15:25:02.435394 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qdk" event={"ID":"e5bec326-8370-4872-9a81-a9aee249006c","Type":"ContainerDied","Data":"0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af"} Nov 26 15:25:02 crc kubenswrapper[5200]: I1126 15:25:02.435436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qdk" event={"ID":"e5bec326-8370-4872-9a81-a9aee249006c","Type":"ContainerStarted","Data":"6a5def2c1f8d554a079a3d94a8ca1e5e08d3e2bbad1a352f7ecd174bdcf2839c"} Nov 26 15:25:04 crc kubenswrapper[5200]: I1126 15:25:04.062077 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-mtvqr"] Nov 26 15:25:04 crc kubenswrapper[5200]: I1126 15:25:04.078408 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/aodh-62f1-account-create-update-ncmvr"] Nov 26 15:25:04 crc kubenswrapper[5200]: I1126 15:25:04.089824 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-62f1-account-create-update-ncmvr"] Nov 26 15:25:04 crc kubenswrapper[5200]: I1126 15:25:04.103590 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-mtvqr"] Nov 26 15:25:04 crc kubenswrapper[5200]: I1126 15:25:04.981822 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552" path="/var/lib/kubelet/pods/29c00c0c-d2c6-48d4-a6a5-3bbc89c7c552/volumes" Nov 26 15:25:04 crc kubenswrapper[5200]: I1126 15:25:04.983057 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751ff29d-e93e-457c-a768-cdce21f92e33" path="/var/lib/kubelet/pods/751ff29d-e93e-457c-a768-cdce21f92e33/volumes" Nov 26 15:25:05 crc kubenswrapper[5200]: I1126 15:25:05.472641 5200 generic.go:358] "Generic (PLEG): container finished" podID="e5bec326-8370-4872-9a81-a9aee249006c" containerID="25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1" exitCode=0 Nov 26 15:25:05 crc kubenswrapper[5200]: I1126 15:25:05.472778 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qdk" event={"ID":"e5bec326-8370-4872-9a81-a9aee249006c","Type":"ContainerDied","Data":"25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1"} Nov 26 15:25:06 crc kubenswrapper[5200]: I1126 15:25:06.485198 5200 generic.go:358] "Generic (PLEG): 
container finished" podID="20f47c26-ed9e-46c8-864d-185b272044f7" containerID="848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc" exitCode=0 Nov 26 15:25:06 crc kubenswrapper[5200]: I1126 15:25:06.485262 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerDied","Data":"848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc"} Nov 26 15:25:06 crc kubenswrapper[5200]: I1126 15:25:06.492090 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qdk" event={"ID":"e5bec326-8370-4872-9a81-a9aee249006c","Type":"ContainerStarted","Data":"44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1"} Nov 26 15:25:06 crc kubenswrapper[5200]: I1126 15:25:06.526335 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w9qdk" podStartSLOduration=3.525766423 podStartE2EDuration="5.526309965s" podCreationTimestamp="2025-11-26 15:25:01 +0000 UTC" firstStartedPulling="2025-11-26 15:25:02.436348512 +0000 UTC m=+6871.115550330" lastFinishedPulling="2025-11-26 15:25:04.436892034 +0000 UTC m=+6873.116093872" observedRunningTime="2025-11-26 15:25:06.519142085 +0000 UTC m=+6875.198343923" watchObservedRunningTime="2025-11-26 15:25:06.526309965 +0000 UTC m=+6875.205511783" Nov 26 15:25:07 crc kubenswrapper[5200]: I1126 15:25:07.505114 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerStarted","Data":"1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96"} Nov 26 15:25:07 crc kubenswrapper[5200]: I1126 15:25:07.526301 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8c5vp" podStartSLOduration=8.209616823 podStartE2EDuration="9.526280078s" podCreationTimestamp="2025-11-26 15:24:58 +0000 UTC" firstStartedPulling="2025-11-26 15:25:00.409861074 +0000 UTC m=+6869.089062892" lastFinishedPulling="2025-11-26 15:25:01.726524329 +0000 UTC m=+6870.405726147" observedRunningTime="2025-11-26 15:25:07.521111901 +0000 UTC m=+6876.200313729" watchObservedRunningTime="2025-11-26 15:25:07.526280078 +0000 UTC m=+6876.205481896" Nov 26 15:25:08 crc kubenswrapper[5200]: I1126 15:25:08.872298 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:25:08 crc kubenswrapper[5200]: I1126 15:25:08.872643 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:25:09 crc kubenswrapper[5200]: I1126 15:25:09.922760 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8c5vp" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="registry-server" probeResult="failure" output=< Nov 26 15:25:09 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:25:09 crc kubenswrapper[5200]: > Nov 26 15:25:11 crc kubenswrapper[5200]: I1126 15:25:11.494974 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:11 crc kubenswrapper[5200]: I1126 15:25:11.495340 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:11 crc kubenswrapper[5200]: I1126 15:25:11.552868 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:11 crc kubenswrapper[5200]: I1126 15:25:11.626282 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:11 crc kubenswrapper[5200]: I1126 15:25:11.803344 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qdk"] Nov 26 15:25:13 crc kubenswrapper[5200]: I1126 15:25:13.562568 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w9qdk" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="registry-server" containerID="cri-o://44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1" gracePeriod=2 Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.127428 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.233072 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7528q\" (UniqueName: \"kubernetes.io/projected/e5bec326-8370-4872-9a81-a9aee249006c-kube-api-access-7528q\") pod \"e5bec326-8370-4872-9a81-a9aee249006c\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.233238 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-catalog-content\") pod \"e5bec326-8370-4872-9a81-a9aee249006c\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.233374 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-utilities\") pod \"e5bec326-8370-4872-9a81-a9aee249006c\" (UID: \"e5bec326-8370-4872-9a81-a9aee249006c\") " Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.234513 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-utilities" (OuterVolumeSpecName: "utilities") pod "e5bec326-8370-4872-9a81-a9aee249006c" (UID: "e5bec326-8370-4872-9a81-a9aee249006c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.243763 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5bec326-8370-4872-9a81-a9aee249006c" (UID: "e5bec326-8370-4872-9a81-a9aee249006c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.247694 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bec326-8370-4872-9a81-a9aee249006c-kube-api-access-7528q" (OuterVolumeSpecName: "kube-api-access-7528q") pod "e5bec326-8370-4872-9a81-a9aee249006c" (UID: "e5bec326-8370-4872-9a81-a9aee249006c"). InnerVolumeSpecName "kube-api-access-7528q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.336551 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7528q\" (UniqueName: \"kubernetes.io/projected/e5bec326-8370-4872-9a81-a9aee249006c-kube-api-access-7528q\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.336872 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.337037 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bec326-8370-4872-9a81-a9aee249006c-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.573527 5200 generic.go:358] "Generic (PLEG): container finished" podID="e5bec326-8370-4872-9a81-a9aee249006c" containerID="44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1" exitCode=0 Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.573960 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qdk" event={"ID":"e5bec326-8370-4872-9a81-a9aee249006c","Type":"ContainerDied","Data":"44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1"} Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.575601 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9qdk" event={"ID":"e5bec326-8370-4872-9a81-a9aee249006c","Type":"ContainerDied","Data":"6a5def2c1f8d554a079a3d94a8ca1e5e08d3e2bbad1a352f7ecd174bdcf2839c"} Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.574097 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9qdk" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.575731 5200 scope.go:117] "RemoveContainer" containerID="44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.610225 5200 scope.go:117] "RemoveContainer" containerID="25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.619416 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qdk"] Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.631110 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9qdk"] Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.636064 5200 scope.go:117] "RemoveContainer" containerID="0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.690244 5200 scope.go:117] "RemoveContainer" containerID="44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1" Nov 26 15:25:14 crc kubenswrapper[5200]: E1126 15:25:14.690809 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1\": container with ID starting with 44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1 not found: ID does not exist" containerID="44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.690858 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1"} err="failed to get container status \"44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1\": rpc error: code = NotFound desc = could not find container \"44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1\": container with ID starting with 44d7bfa1bbf5856f4ad3e2f1cda9d1661abd793b957b777f3a9d4d011b5a98a1 not found: ID does not exist" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.690883 5200 scope.go:117] "RemoveContainer" containerID="25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1" Nov 26 15:25:14 crc kubenswrapper[5200]: E1126 15:25:14.691157 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1\": container with ID starting with 25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1 not found: ID does not exist" containerID="25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.691186 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1"} err="failed to get container status \"25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1\": rpc error: code = NotFound desc = could not find container \"25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1\": container with ID starting with 25c020f995a78dbed40c647c22764f73db92d402721c9dac5fd12900b471d8b1 not found: ID does not exist" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.691200 5200 scope.go:117] "RemoveContainer" 
containerID="0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af" Nov 26 15:25:14 crc kubenswrapper[5200]: E1126 15:25:14.691536 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af\": container with ID starting with 0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af not found: ID does not exist" containerID="0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.691568 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af"} err="failed to get container status \"0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af\": rpc error: code = NotFound desc = could not find container \"0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af\": container with ID starting with 0bf40af570ea742987aff4ef9c6ebe5c2259f28e3d41f28b7a02ea922c51d2af not found: ID does not exist" Nov 26 15:25:14 crc kubenswrapper[5200]: I1126 15:25:14.987084 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bec326-8370-4872-9a81-a9aee249006c" path="/var/lib/kubelet/pods/e5bec326-8370-4872-9a81-a9aee249006c/volumes" Nov 26 15:25:16 crc kubenswrapper[5200]: I1126 15:25:16.033506 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-fxp72"] Nov 26 15:25:16 crc kubenswrapper[5200]: I1126 15:25:16.045099 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-fxp72"] Nov 26 15:25:16 crc kubenswrapper[5200]: I1126 15:25:16.982432 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc8058e-4b34-41fd-893f-a35dd98059fd" path="/var/lib/kubelet/pods/dcc8058e-4b34-41fd-893f-a35dd98059fd/volumes" Nov 26 15:25:18 crc kubenswrapper[5200]: I1126 15:25:18.918864 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:25:18 crc kubenswrapper[5200]: I1126 15:25:18.986659 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:25:19 crc kubenswrapper[5200]: I1126 15:25:19.152601 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8c5vp"] Nov 26 15:25:20 crc kubenswrapper[5200]: I1126 15:25:20.641197 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8c5vp" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="registry-server" containerID="cri-o://1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96" gracePeriod=2 Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.170507 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.238457 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-utilities\") pod \"20f47c26-ed9e-46c8-864d-185b272044f7\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.238605 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2289\" (UniqueName: \"kubernetes.io/projected/20f47c26-ed9e-46c8-864d-185b272044f7-kube-api-access-l2289\") pod \"20f47c26-ed9e-46c8-864d-185b272044f7\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.239091 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-catalog-content\") pod \"20f47c26-ed9e-46c8-864d-185b272044f7\" (UID: \"20f47c26-ed9e-46c8-864d-185b272044f7\") " Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.240738 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-utilities" (OuterVolumeSpecName: "utilities") pod "20f47c26-ed9e-46c8-864d-185b272044f7" (UID: "20f47c26-ed9e-46c8-864d-185b272044f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.246009 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f47c26-ed9e-46c8-864d-185b272044f7-kube-api-access-l2289" (OuterVolumeSpecName: "kube-api-access-l2289") pod "20f47c26-ed9e-46c8-864d-185b272044f7" (UID: "20f47c26-ed9e-46c8-864d-185b272044f7"). InnerVolumeSpecName "kube-api-access-l2289". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.341311 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.341570 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2289\" (UniqueName: \"kubernetes.io/projected/20f47c26-ed9e-46c8-864d-185b272044f7-kube-api-access-l2289\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.344739 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20f47c26-ed9e-46c8-864d-185b272044f7" (UID: "20f47c26-ed9e-46c8-864d-185b272044f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.451443 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20f47c26-ed9e-46c8-864d-185b272044f7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.655170 5200 generic.go:358] "Generic (PLEG): container finished" podID="20f47c26-ed9e-46c8-864d-185b272044f7" containerID="1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96" exitCode=0 Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.655487 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerDied","Data":"1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96"} Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.655519 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8c5vp" event={"ID":"20f47c26-ed9e-46c8-864d-185b272044f7","Type":"ContainerDied","Data":"279512a6a1b00ec28e865e54842aca2d5948bbbc326515725eadd3cf92000f81"} Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.655536 5200 scope.go:117] "RemoveContainer" containerID="1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.655774 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8c5vp" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.680992 5200 scope.go:117] "RemoveContainer" containerID="848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.696774 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8c5vp"] Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.708511 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8c5vp"] Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.715742 5200 scope.go:117] "RemoveContainer" containerID="7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.750033 5200 scope.go:117] "RemoveContainer" containerID="1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96" Nov 26 15:25:21 crc kubenswrapper[5200]: E1126 15:25:21.750880 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96\": container with ID starting with 1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96 not found: ID does not exist" containerID="1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.750917 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96"} err="failed to get container status \"1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96\": rpc error: code = NotFound desc = could not find container \"1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96\": container with ID starting with 1e0ba040235c5adb8df4a028aecbcde46f9be92411789af3f1122091fb9b2c96 not found: ID does not exist" Nov 26 15:25:21 crc 
kubenswrapper[5200]: I1126 15:25:21.751083 5200 scope.go:117] "RemoveContainer" containerID="848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc" Nov 26 15:25:21 crc kubenswrapper[5200]: E1126 15:25:21.751506 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc\": container with ID starting with 848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc not found: ID does not exist" containerID="848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.751536 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc"} err="failed to get container status \"848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc\": rpc error: code = NotFound desc = could not find container \"848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc\": container with ID starting with 848a8e55458a0f422e0d224229044b7da895ece07d6eb8fefdc7980785b292fc not found: ID does not exist" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.751557 5200 scope.go:117] "RemoveContainer" containerID="7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a" Nov 26 15:25:21 crc kubenswrapper[5200]: E1126 15:25:21.751783 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a\": container with ID starting with 7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a not found: ID does not exist" containerID="7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a" Nov 26 15:25:21 crc kubenswrapper[5200]: I1126 15:25:21.751799 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a"} err="failed to get container status \"7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a\": rpc error: code = NotFound desc = could not find container \"7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a\": container with ID starting with 7106ef93658ce8afa33fb0c4f6788a7087833e6eb59d78c667163a55db5d508a not found: ID does not exist" Nov 26 15:25:22 crc kubenswrapper[5200]: I1126 15:25:22.995338 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" path="/var/lib/kubelet/pods/20f47c26-ed9e-46c8-864d-185b272044f7/volumes" Nov 26 15:25:33 crc kubenswrapper[5200]: I1126 15:25:33.154317 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:25:33 crc kubenswrapper[5200]: I1126 15:25:33.154892 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:25:34 crc kubenswrapper[5200]: I1126 15:25:34.071117 5200 scope.go:117] "RemoveContainer" 
containerID="8dcfe268fec275ba9af34029bc57e899701a6f48aadebd974af72142a35e637e" Nov 26 15:25:34 crc kubenswrapper[5200]: I1126 15:25:34.106144 5200 scope.go:117] "RemoveContainer" containerID="405e3e4dcf142cc3b989abdaf6f9edccb02a96f539f6fd49db0543441246a39f" Nov 26 15:25:34 crc kubenswrapper[5200]: I1126 15:25:34.192115 5200 scope.go:117] "RemoveContainer" containerID="254e39fb52b94a5955381f88436be52afdd8f843264fd6c308ce8cadec8f19f6" Nov 26 15:25:37 crc kubenswrapper[5200]: I1126 15:25:37.994000 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:25:38 crc kubenswrapper[5200]: I1126 15:25:38.000224 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:25:38 crc kubenswrapper[5200]: I1126 15:25:38.004128 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:25:38 crc kubenswrapper[5200]: I1126 15:25:38.004190 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:25:39 crc kubenswrapper[5200]: I1126 15:25:39.044844 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/manila-83fd-account-create-update-h5bvf"] Nov 26 15:25:39 crc kubenswrapper[5200]: I1126 15:25:39.058180 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/manila-83fd-account-create-update-h5bvf"] Nov 26 15:25:40 crc kubenswrapper[5200]: I1126 15:25:40.025389 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-5nj7t"] Nov 26 15:25:40 crc kubenswrapper[5200]: I1126 15:25:40.035067 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-5nj7t"] Nov 26 15:25:40 crc kubenswrapper[5200]: I1126 15:25:40.984071 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ca5afe-27f4-40ac-abec-e35e45e94345" path="/var/lib/kubelet/pods/79ca5afe-27f4-40ac-abec-e35e45e94345/volumes" Nov 26 15:25:40 crc kubenswrapper[5200]: I1126 15:25:40.985224 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd" path="/var/lib/kubelet/pods/87cc06a3-07fc-407c-91bf-0e5bfa8d2bcd/volumes" Nov 26 15:25:42 crc kubenswrapper[5200]: E1126 15:25:42.879294 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441986 actualBytes=10240 Nov 26 15:25:52 crc kubenswrapper[5200]: I1126 15:25:52.047351 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-k77xb"] Nov 26 15:25:52 crc kubenswrapper[5200]: I1126 15:25:52.061925 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-k77xb"] Nov 26 15:25:52 crc kubenswrapper[5200]: I1126 15:25:52.982399 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c278290f-ff8f-407d-bdd9-78cbb944ad57" path="/var/lib/kubelet/pods/c278290f-ff8f-407d-bdd9-78cbb944ad57/volumes" Nov 26 15:26:03 crc kubenswrapper[5200]: I1126 15:26:03.154604 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:26:03 crc kubenswrapper[5200]: I1126 15:26:03.155270 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.154655 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.155397 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.155476 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.156461 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"060c735b3c6b91a7b59f706f13708c0f59f974db6c9c74cdcd8ef34b9041e93c"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.156534 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://060c735b3c6b91a7b59f706f13708c0f59f974db6c9c74cdcd8ef34b9041e93c" gracePeriod=600 Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.300043 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.414112 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="060c735b3c6b91a7b59f706f13708c0f59f974db6c9c74cdcd8ef34b9041e93c" exitCode=0 Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.414165 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"060c735b3c6b91a7b59f706f13708c0f59f974db6c9c74cdcd8ef34b9041e93c"} Nov 26 15:26:33 crc kubenswrapper[5200]: I1126 15:26:33.414206 5200 scope.go:117] "RemoveContainer" containerID="b147407bdd86f79df7cf08d3d856c8c21fbb480291c2af4f98d071ac5fe75525" Nov 26 15:26:34 crc kubenswrapper[5200]: I1126 15:26:34.368530 5200 scope.go:117] "RemoveContainer" containerID="00269e0f8dcf50852e327a81bf08b1abb3a03d96ec4f11c9e43e43a4c9bfe82b" Nov 26 15:26:34 crc kubenswrapper[5200]: I1126 15:26:34.415966 5200 scope.go:117] "RemoveContainer" 
containerID="aa64e4429244f42b2661ba7239b3bce128afeec7c704aaab31660f819f2468e2" Nov 26 15:26:34 crc kubenswrapper[5200]: I1126 15:26:34.428920 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f"} Nov 26 15:26:34 crc kubenswrapper[5200]: I1126 15:26:34.497101 5200 scope.go:117] "RemoveContainer" containerID="3dc00532f78d02d4ee53b5fab266967b08754b4d9b71efa4bd32bcef72847ce5" Nov 26 15:26:42 crc kubenswrapper[5200]: E1126 15:26:42.754544 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441991 actualBytes=10240 Nov 26 15:27:42 crc kubenswrapper[5200]: E1126 15:27:42.499942 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441981 actualBytes=10240 Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.642935 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8pg6c"] Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645493 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="registry-server" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645523 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="registry-server" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645568 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="extract-utilities" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645577 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="extract-utilities" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645597 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="extract-content" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645606 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="extract-content" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645619 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="extract-content" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645628 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="extract-content" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645647 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="extract-utilities" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645655 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="extract-utilities" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645677 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="registry-server" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645700 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="registry-server" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645910 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5bec326-8370-4872-9a81-a9aee249006c" containerName="registry-server" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.645935 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f47c26-ed9e-46c8-864d-185b272044f7" containerName="registry-server" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.657105 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.660965 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pg6c"] Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.772030 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46dqz\" (UniqueName: \"kubernetes.io/projected/97e720d8-1e2b-4309-8686-29dbaf794ae3-kube-api-access-46dqz\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.772213 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-catalog-content\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.772285 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-utilities\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.875070 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46dqz\" (UniqueName: \"kubernetes.io/projected/97e720d8-1e2b-4309-8686-29dbaf794ae3-kube-api-access-46dqz\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.875262 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-catalog-content\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.875381 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-utilities\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.876166 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-utilities\") pod 
\"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.877034 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-catalog-content\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.904018 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46dqz\" (UniqueName: \"kubernetes.io/projected/97e720d8-1e2b-4309-8686-29dbaf794ae3-kube-api-access-46dqz\") pod \"certified-operators-8pg6c\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:57 crc kubenswrapper[5200]: I1126 15:27:57.995111 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:27:58 crc kubenswrapper[5200]: I1126 15:27:58.514010 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pg6c"] Nov 26 15:27:59 crc kubenswrapper[5200]: I1126 15:27:59.307245 5200 generic.go:358] "Generic (PLEG): container finished" podID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerID="aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86" exitCode=0 Nov 26 15:27:59 crc kubenswrapper[5200]: I1126 15:27:59.307346 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerDied","Data":"aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86"} Nov 26 15:27:59 crc kubenswrapper[5200]: I1126 15:27:59.309842 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerStarted","Data":"23e6fee3485499b62ab097ef79f8f1c7a736bb52b4880ff08b0a1fb84f94a960"} Nov 26 15:28:01 crc kubenswrapper[5200]: I1126 15:28:01.333946 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerStarted","Data":"ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f"} Nov 26 15:28:02 crc kubenswrapper[5200]: I1126 15:28:02.348654 5200 generic.go:358] "Generic (PLEG): container finished" podID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerID="ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f" exitCode=0 Nov 26 15:28:02 crc kubenswrapper[5200]: I1126 15:28:02.348769 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerDied","Data":"ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f"} Nov 26 15:28:03 crc kubenswrapper[5200]: I1126 15:28:03.364468 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerStarted","Data":"32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be"} Nov 26 15:28:03 crc kubenswrapper[5200]: I1126 15:28:03.394050 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-8pg6c" podStartSLOduration=5.179210419 podStartE2EDuration="6.394015462s" podCreationTimestamp="2025-11-26 15:27:57 +0000 UTC" firstStartedPulling="2025-11-26 15:27:59.308492536 +0000 UTC m=+7047.987694354" lastFinishedPulling="2025-11-26 15:28:00.523297579 +0000 UTC m=+7049.202499397" observedRunningTime="2025-11-26 15:28:03.386445642 +0000 UTC m=+7052.065647520" watchObservedRunningTime="2025-11-26 15:28:03.394015462 +0000 UTC m=+7052.073217320" Nov 26 15:28:07 crc kubenswrapper[5200]: I1126 15:28:07.997324 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:28:08 crc kubenswrapper[5200]: I1126 15:28:08.000056 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:28:08 crc kubenswrapper[5200]: I1126 15:28:08.089869 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:28:08 crc kubenswrapper[5200]: I1126 15:28:08.456528 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:28:08 crc kubenswrapper[5200]: I1126 15:28:08.511871 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pg6c"] Nov 26 15:28:10 crc kubenswrapper[5200]: I1126 15:28:10.432620 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8pg6c" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="registry-server" containerID="cri-o://32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be" gracePeriod=2 Nov 26 15:28:10 crc kubenswrapper[5200]: I1126 15:28:10.946108 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.044972 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-utilities\") pod \"97e720d8-1e2b-4309-8686-29dbaf794ae3\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.045191 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46dqz\" (UniqueName: \"kubernetes.io/projected/97e720d8-1e2b-4309-8686-29dbaf794ae3-kube-api-access-46dqz\") pod \"97e720d8-1e2b-4309-8686-29dbaf794ae3\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.045447 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-catalog-content\") pod \"97e720d8-1e2b-4309-8686-29dbaf794ae3\" (UID: \"97e720d8-1e2b-4309-8686-29dbaf794ae3\") " Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.046462 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-utilities" (OuterVolumeSpecName: "utilities") pod "97e720d8-1e2b-4309-8686-29dbaf794ae3" (UID: "97e720d8-1e2b-4309-8686-29dbaf794ae3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.056560 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e720d8-1e2b-4309-8686-29dbaf794ae3-kube-api-access-46dqz" (OuterVolumeSpecName: "kube-api-access-46dqz") pod "97e720d8-1e2b-4309-8686-29dbaf794ae3" (UID: "97e720d8-1e2b-4309-8686-29dbaf794ae3"). InnerVolumeSpecName "kube-api-access-46dqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.084064 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97e720d8-1e2b-4309-8686-29dbaf794ae3" (UID: "97e720d8-1e2b-4309-8686-29dbaf794ae3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.148350 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.148385 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e720d8-1e2b-4309-8686-29dbaf794ae3-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.148395 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-46dqz\" (UniqueName: \"kubernetes.io/projected/97e720d8-1e2b-4309-8686-29dbaf794ae3-kube-api-access-46dqz\") on node \"crc\" DevicePath \"\"" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.444883 5200 generic.go:358] "Generic (PLEG): container finished" podID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerID="32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be" exitCode=0 Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.445003 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pg6c" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.445026 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerDied","Data":"32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be"} Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.445085 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pg6c" event={"ID":"97e720d8-1e2b-4309-8686-29dbaf794ae3","Type":"ContainerDied","Data":"23e6fee3485499b62ab097ef79f8f1c7a736bb52b4880ff08b0a1fb84f94a960"} Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.445108 5200 scope.go:117] "RemoveContainer" containerID="32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.471873 5200 scope.go:117] "RemoveContainer" containerID="ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.494990 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pg6c"] Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.503270 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8pg6c"] Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.524994 5200 scope.go:117] "RemoveContainer" containerID="aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.556822 5200 scope.go:117] "RemoveContainer" containerID="32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be" Nov 26 15:28:11 crc kubenswrapper[5200]: E1126 15:28:11.557477 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be\": container with ID starting with 32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be not found: ID does not exist" containerID="32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.557509 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be"} err="failed to get container status \"32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be\": rpc error: code = NotFound desc = could not find container \"32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be\": container with ID starting with 32acc8c23ecf3a6b0b6ec18e4bffaa4daf6333dad80890cf06e819c664c170be not found: ID does not exist" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.557530 5200 scope.go:117] "RemoveContainer" containerID="ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f" Nov 26 15:28:11 crc kubenswrapper[5200]: E1126 15:28:11.557868 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f\": container with ID starting with ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f not found: ID does not exist" containerID="ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.557927 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f"} err="failed to get container status \"ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f\": rpc error: code = NotFound desc = could not find container \"ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f\": container with ID starting with ee3dc9882d4225333901e547f1061421a361087ada331cb7b3f0c3a10393de9f not found: ID does not exist" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.557956 5200 scope.go:117] "RemoveContainer" containerID="aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86" Nov 26 15:28:11 crc kubenswrapper[5200]: E1126 15:28:11.558240 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86\": container with ID starting with aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86 not found: ID does not exist" containerID="aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86" Nov 26 15:28:11 crc kubenswrapper[5200]: I1126 15:28:11.558268 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86"} err="failed to get container status \"aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86\": rpc error: code = NotFound desc = could not find container \"aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86\": container with ID starting with aad8286a86cc43b5a2c8e4b500666803fbb82288a9fbb6300ba17048514ffa86 not found: ID does not exist" Nov 26 15:28:13 crc kubenswrapper[5200]: I1126 15:28:13.167775 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" path="/var/lib/kubelet/pods/97e720d8-1e2b-4309-8686-29dbaf794ae3/volumes" Nov 26 15:28:33 crc kubenswrapper[5200]: I1126 15:28:33.154485 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:28:33 crc kubenswrapper[5200]: I1126 15:28:33.155048 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:28:42 crc kubenswrapper[5200]: E1126 15:28:42.456588 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441973 actualBytes=10240 Nov 26 15:29:03 crc kubenswrapper[5200]: I1126 15:29:03.154134 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:29:03 crc kubenswrapper[5200]: I1126 15:29:03.154662 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:29:29 crc kubenswrapper[5200]: I1126 15:29:29.965091 5200 generic.go:358] "Generic (PLEG): container finished" podID="180e251a-424a-4738-852d-1c2bb96044df" containerID="5f37e84cc3a2b1e4c561f3df8b591f90ffba1fd75b19cff2c8481917406d3121" exitCode=0 Nov 26 15:29:29 crc kubenswrapper[5200]: I1126 15:29:29.965177 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" event={"ID":"180e251a-424a-4738-852d-1c2bb96044df","Type":"ContainerDied","Data":"5f37e84cc3a2b1e4c561f3df8b591f90ffba1fd75b19cff2c8481917406d3121"} Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.434227 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.559521 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ssh-key\") pod \"180e251a-424a-4738-852d-1c2bb96044df\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.559626 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rbcm\" (UniqueName: \"kubernetes.io/projected/180e251a-424a-4738-852d-1c2bb96044df-kube-api-access-6rbcm\") pod \"180e251a-424a-4738-852d-1c2bb96044df\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.559809 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-tripleo-cleanup-combined-ca-bundle\") pod \"180e251a-424a-4738-852d-1c2bb96044df\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.559829 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ceph\") pod \"180e251a-424a-4738-852d-1c2bb96044df\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.560039 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-inventory\") pod \"180e251a-424a-4738-852d-1c2bb96044df\" (UID: \"180e251a-424a-4738-852d-1c2bb96044df\") " Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.565698 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "180e251a-424a-4738-852d-1c2bb96044df" (UID: "180e251a-424a-4738-852d-1c2bb96044df"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.566164 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180e251a-424a-4738-852d-1c2bb96044df-kube-api-access-6rbcm" (OuterVolumeSpecName: "kube-api-access-6rbcm") pod "180e251a-424a-4738-852d-1c2bb96044df" (UID: "180e251a-424a-4738-852d-1c2bb96044df"). InnerVolumeSpecName "kube-api-access-6rbcm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.572197 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ceph" (OuterVolumeSpecName: "ceph") pod "180e251a-424a-4738-852d-1c2bb96044df" (UID: "180e251a-424a-4738-852d-1c2bb96044df"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.590594 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "180e251a-424a-4738-852d-1c2bb96044df" (UID: "180e251a-424a-4738-852d-1c2bb96044df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.591321 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-inventory" (OuterVolumeSpecName: "inventory") pod "180e251a-424a-4738-852d-1c2bb96044df" (UID: "180e251a-424a-4738-852d-1c2bb96044df"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.662832 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.663469 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.663581 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rbcm\" (UniqueName: \"kubernetes.io/projected/180e251a-424a-4738-852d-1c2bb96044df-kube-api-access-6rbcm\") on node \"crc\" DevicePath \"\"" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.663706 5200 reconciler_common.go:299] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.663845 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/180e251a-424a-4738-852d-1c2bb96044df-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.987375 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.987443 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv" event={"ID":"180e251a-424a-4738-852d-1c2bb96044df","Type":"ContainerDied","Data":"332f2df63ccc91bd987bd9522e3b417253cc54739ce6162cc6f824b83d0ddb45"} Nov 26 15:29:31 crc kubenswrapper[5200]: I1126 15:29:31.987758 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="332f2df63ccc91bd987bd9522e3b417253cc54739ce6162cc6f824b83d0ddb45" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.119535 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rg4cg"] Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121364 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="180e251a-424a-4738-852d-1c2bb96044df" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121387 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="180e251a-424a-4738-852d-1c2bb96044df" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121416 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="extract-content" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121422 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="extract-content" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121446 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="registry-server" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121451 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="registry-server" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121500 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="extract-utilities" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121507 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="extract-utilities" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121729 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="180e251a-424a-4738-852d-1c2bb96044df" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.121757 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="97e720d8-1e2b-4309-8686-29dbaf794ae3" containerName="registry-server" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.126430 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.129525 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.129603 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rg4cg"] Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.129858 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.129863 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.131730 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.154087 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.154164 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.154214 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.155011 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.155090 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" gracePeriod=600 Nov 26 15:29:33 crc kubenswrapper[5200]: E1126 15:29:33.299334 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.306626 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ceph\") pod 
\"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.306815 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg9c7\" (UniqueName: \"kubernetes.io/projected/30e6b5f6-86f5-422f-92c2-f862bb5ef259-kube-api-access-tg9c7\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.306845 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.306878 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.307042 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-inventory\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.409161 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-inventory\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.409313 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ceph\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.409364 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg9c7\" (UniqueName: \"kubernetes.io/projected/30e6b5f6-86f5-422f-92c2-f862bb5ef259-kube-api-access-tg9c7\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.409404 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.409451 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.414672 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.414991 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-inventory\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.414993 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.421313 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ceph\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.428142 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg9c7\" (UniqueName: \"kubernetes.io/projected/30e6b5f6-86f5-422f-92c2-f862bb5ef259-kube-api-access-tg9c7\") pod \"bootstrap-openstack-openstack-cell1-rg4cg\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.460443 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:29:33 crc kubenswrapper[5200]: I1126 15:29:33.993439 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rg4cg"] Nov 26 15:29:33 crc kubenswrapper[5200]: W1126 15:29:33.998874 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e6b5f6_86f5_422f_92c2_f862bb5ef259.slice/crio-fdb217a860aea0dfcd58f45406039ebc727e0e9599056f4b1f3a669864c05eb7 WatchSource:0}: Error finding container fdb217a860aea0dfcd58f45406039ebc727e0e9599056f4b1f3a669864c05eb7: Status 404 returned error can't find the container with id fdb217a860aea0dfcd58f45406039ebc727e0e9599056f4b1f3a669864c05eb7 Nov 26 15:29:34 crc kubenswrapper[5200]: I1126 15:29:34.010057 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" exitCode=0 Nov 26 15:29:34 crc kubenswrapper[5200]: I1126 15:29:34.010146 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f"} Nov 26 15:29:34 crc kubenswrapper[5200]: I1126 15:29:34.010197 5200 scope.go:117] "RemoveContainer" containerID="060c735b3c6b91a7b59f706f13708c0f59f974db6c9c74cdcd8ef34b9041e93c" Nov 26 15:29:34 crc kubenswrapper[5200]: I1126 15:29:34.011098 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:29:34 crc kubenswrapper[5200]: E1126 15:29:34.011604 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:29:35 crc kubenswrapper[5200]: I1126 15:29:35.025006 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" event={"ID":"30e6b5f6-86f5-422f-92c2-f862bb5ef259","Type":"ContainerStarted","Data":"fdb217a860aea0dfcd58f45406039ebc727e0e9599056f4b1f3a669864c05eb7"} Nov 26 15:29:36 crc kubenswrapper[5200]: I1126 15:29:36.039766 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" event={"ID":"30e6b5f6-86f5-422f-92c2-f862bb5ef259","Type":"ContainerStarted","Data":"7fc2a2a6240aaddf3de082e2bdcba8bf894e9ef800f9dc677f1a4650baf2b598"} Nov 26 15:29:36 crc kubenswrapper[5200]: I1126 15:29:36.064625 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" podStartSLOduration=2.287381861 podStartE2EDuration="3.064606176s" podCreationTimestamp="2025-11-26 15:29:33 +0000 UTC" firstStartedPulling="2025-11-26 15:29:34.002660012 +0000 UTC m=+7142.681861830" lastFinishedPulling="2025-11-26 15:29:34.779884327 +0000 UTC m=+7143.459086145" observedRunningTime="2025-11-26 15:29:36.058562857 +0000 UTC m=+7144.737764675" watchObservedRunningTime="2025-11-26 15:29:36.064606176 +0000 UTC m=+7144.743807994" Nov 26 15:29:42 
crc kubenswrapper[5200]: E1126 15:29:42.469943 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443179 actualBytes=10240 Nov 26 15:29:47 crc kubenswrapper[5200]: I1126 15:29:47.969351 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:29:47 crc kubenswrapper[5200]: E1126 15:29:47.970428 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.146106 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m"] Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.153937 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.156131 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.156407 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.157716 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m"] Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.226439 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cce5b96-9f16-45b4-be33-b50774e13c38-secret-volume\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.226876 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssj5t\" (UniqueName: \"kubernetes.io/projected/8cce5b96-9f16-45b4-be33-b50774e13c38-kube-api-access-ssj5t\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.227204 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cce5b96-9f16-45b4-be33-b50774e13c38-config-volume\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.329805 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssj5t\" (UniqueName: \"kubernetes.io/projected/8cce5b96-9f16-45b4-be33-b50774e13c38-kube-api-access-ssj5t\") pod \"collect-profiles-29402850-m2p8m\" (UID: 
\"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.329903 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cce5b96-9f16-45b4-be33-b50774e13c38-config-volume\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.330060 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cce5b96-9f16-45b4-be33-b50774e13c38-secret-volume\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.330715 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cce5b96-9f16-45b4-be33-b50774e13c38-config-volume\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.336181 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cce5b96-9f16-45b4-be33-b50774e13c38-secret-volume\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.347614 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssj5t\" (UniqueName: \"kubernetes.io/projected/8cce5b96-9f16-45b4-be33-b50774e13c38-kube-api-access-ssj5t\") pod \"collect-profiles-29402850-m2p8m\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.520980 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:00 crc kubenswrapper[5200]: I1126 15:30:00.964719 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m"] Nov 26 15:30:01 crc kubenswrapper[5200]: I1126 15:30:01.292377 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" event={"ID":"8cce5b96-9f16-45b4-be33-b50774e13c38","Type":"ContainerStarted","Data":"7aa4b6f9112e979cb51369f10095e85ab9f6f56c05c02d447d406b9edc51b428"} Nov 26 15:30:01 crc kubenswrapper[5200]: I1126 15:30:01.292435 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" event={"ID":"8cce5b96-9f16-45b4-be33-b50774e13c38","Type":"ContainerStarted","Data":"5fe1b17e539641713b0baa418dd8b15976630fcd090bccf245aa5131d70f3336"} Nov 26 15:30:01 crc kubenswrapper[5200]: I1126 15:30:01.313318 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" podStartSLOduration=1.313297004 podStartE2EDuration="1.313297004s" podCreationTimestamp="2025-11-26 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:30:01.306320819 +0000 UTC m=+7169.985522627" watchObservedRunningTime="2025-11-26 15:30:01.313297004 +0000 UTC m=+7169.992498822" Nov 26 15:30:02 crc kubenswrapper[5200]: I1126 15:30:02.306250 5200 generic.go:358] "Generic (PLEG): container finished" podID="8cce5b96-9f16-45b4-be33-b50774e13c38" containerID="7aa4b6f9112e979cb51369f10095e85ab9f6f56c05c02d447d406b9edc51b428" exitCode=0 Nov 26 15:30:02 crc kubenswrapper[5200]: I1126 15:30:02.306407 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" event={"ID":"8cce5b96-9f16-45b4-be33-b50774e13c38","Type":"ContainerDied","Data":"7aa4b6f9112e979cb51369f10095e85ab9f6f56c05c02d447d406b9edc51b428"} Nov 26 15:30:02 crc kubenswrapper[5200]: I1126 15:30:02.981280 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:30:02 crc kubenswrapper[5200]: E1126 15:30:02.981816 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.716392 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.825522 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cce5b96-9f16-45b4-be33-b50774e13c38-secret-volume\") pod \"8cce5b96-9f16-45b4-be33-b50774e13c38\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.825647 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssj5t\" (UniqueName: \"kubernetes.io/projected/8cce5b96-9f16-45b4-be33-b50774e13c38-kube-api-access-ssj5t\") pod \"8cce5b96-9f16-45b4-be33-b50774e13c38\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.825672 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cce5b96-9f16-45b4-be33-b50774e13c38-config-volume\") pod \"8cce5b96-9f16-45b4-be33-b50774e13c38\" (UID: \"8cce5b96-9f16-45b4-be33-b50774e13c38\") " Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.826401 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cce5b96-9f16-45b4-be33-b50774e13c38-config-volume" (OuterVolumeSpecName: "config-volume") pod "8cce5b96-9f16-45b4-be33-b50774e13c38" (UID: "8cce5b96-9f16-45b4-be33-b50774e13c38"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.826901 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cce5b96-9f16-45b4-be33-b50774e13c38-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.831973 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cce5b96-9f16-45b4-be33-b50774e13c38-kube-api-access-ssj5t" (OuterVolumeSpecName: "kube-api-access-ssj5t") pod "8cce5b96-9f16-45b4-be33-b50774e13c38" (UID: "8cce5b96-9f16-45b4-be33-b50774e13c38"). InnerVolumeSpecName "kube-api-access-ssj5t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.833960 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce5b96-9f16-45b4-be33-b50774e13c38-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8cce5b96-9f16-45b4-be33-b50774e13c38" (UID: "8cce5b96-9f16-45b4-be33-b50774e13c38"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.929485 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cce5b96-9f16-45b4-be33-b50774e13c38-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:03 crc kubenswrapper[5200]: I1126 15:30:03.929857 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ssj5t\" (UniqueName: \"kubernetes.io/projected/8cce5b96-9f16-45b4-be33-b50774e13c38-kube-api-access-ssj5t\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:04 crc kubenswrapper[5200]: I1126 15:30:04.329317 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" Nov 26 15:30:04 crc kubenswrapper[5200]: I1126 15:30:04.329363 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m" event={"ID":"8cce5b96-9f16-45b4-be33-b50774e13c38","Type":"ContainerDied","Data":"5fe1b17e539641713b0baa418dd8b15976630fcd090bccf245aa5131d70f3336"} Nov 26 15:30:04 crc kubenswrapper[5200]: I1126 15:30:04.329433 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fe1b17e539641713b0baa418dd8b15976630fcd090bccf245aa5131d70f3336" Nov 26 15:30:04 crc kubenswrapper[5200]: I1126 15:30:04.385890 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp"] Nov 26 15:30:04 crc kubenswrapper[5200]: I1126 15:30:04.394983 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-7hsqp"] Nov 26 15:30:04 crc kubenswrapper[5200]: I1126 15:30:04.981312 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9783529c-b45f-464e-b451-ef77c2d0f41f" path="/var/lib/kubelet/pods/9783529c-b45f-464e-b451-ef77c2d0f41f/volumes" Nov 26 15:30:15 crc kubenswrapper[5200]: I1126 15:30:15.969395 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:30:15 crc kubenswrapper[5200]: E1126 15:30:15.970146 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:30:27 crc kubenswrapper[5200]: I1126 15:30:27.970264 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:30:27 crc kubenswrapper[5200]: E1126 15:30:27.971224 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:30:34 crc kubenswrapper[5200]: I1126 15:30:34.667845 5200 scope.go:117] "RemoveContainer" containerID="0dd57eb0ded6b4a153a2992cee09497d1e91a43a64584d54fda46ffdcb753f1c" Nov 26 15:30:38 crc kubenswrapper[5200]: I1126 15:30:38.173202 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:30:38 crc kubenswrapper[5200]: I1126 15:30:38.178252 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:30:38 crc kubenswrapper[5200]: I1126 15:30:38.178397 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:30:38 crc kubenswrapper[5200]: I1126 
15:30:38.183021 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:30:41 crc kubenswrapper[5200]: I1126 15:30:41.970750 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:30:41 crc kubenswrapper[5200]: E1126 15:30:41.972162 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:30:42 crc kubenswrapper[5200]: E1126 15:30:42.608365 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443171 actualBytes=10240 Nov 26 15:30:55 crc kubenswrapper[5200]: I1126 15:30:55.969603 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:30:55 crc kubenswrapper[5200]: E1126 15:30:55.970807 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:31:06 crc kubenswrapper[5200]: I1126 15:31:06.969965 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:31:06 crc kubenswrapper[5200]: E1126 15:31:06.970776 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:31:19 crc kubenswrapper[5200]: I1126 15:31:19.970537 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:31:19 crc kubenswrapper[5200]: E1126 15:31:19.971315 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:31:32 crc kubenswrapper[5200]: I1126 15:31:32.978656 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:31:32 crc kubenswrapper[5200]: E1126 15:31:32.979394 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:31:42 crc kubenswrapper[5200]: E1126 15:31:42.589106 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443173 actualBytes=10240 Nov 26 15:31:43 crc kubenswrapper[5200]: I1126 15:31:43.969856 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:31:43 crc kubenswrapper[5200]: E1126 15:31:43.970369 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:31:55 crc kubenswrapper[5200]: I1126 15:31:55.969355 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:31:55 crc kubenswrapper[5200]: E1126 15:31:55.970276 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.394125 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ldxq7"] Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.395900 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cce5b96-9f16-45b4-be33-b50774e13c38" containerName="collect-profiles" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.395915 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cce5b96-9f16-45b4-be33-b50774e13c38" containerName="collect-profiles" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.396097 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cce5b96-9f16-45b4-be33-b50774e13c38" containerName="collect-profiles" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.402584 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.415156 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ldxq7"] Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.513412 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-catalog-content\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.513920 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm299\" (UniqueName: \"kubernetes.io/projected/ef2b8827-db0c-4c18-a81d-34939d1fa13b-kube-api-access-pm299\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.514223 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-utilities\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.617589 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-catalog-content\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.617877 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm299\" (UniqueName: \"kubernetes.io/projected/ef2b8827-db0c-4c18-a81d-34939d1fa13b-kube-api-access-pm299\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.618090 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-utilities\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.618101 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-catalog-content\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.618643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-utilities\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.649491 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pm299\" (UniqueName: \"kubernetes.io/projected/ef2b8827-db0c-4c18-a81d-34939d1fa13b-kube-api-access-pm299\") pod \"community-operators-ldxq7\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:00 crc kubenswrapper[5200]: I1126 15:32:00.753569 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:01 crc kubenswrapper[5200]: I1126 15:32:01.249551 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ldxq7"] Nov 26 15:32:01 crc kubenswrapper[5200]: I1126 15:32:01.266634 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:32:01 crc kubenswrapper[5200]: I1126 15:32:01.599436 5200 generic.go:358] "Generic (PLEG): container finished" podID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerID="2e07b9d96f0ab11d3042199b7a75f309a1e2e336683c3481cab743400c29f028" exitCode=0 Nov 26 15:32:01 crc kubenswrapper[5200]: I1126 15:32:01.599522 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldxq7" event={"ID":"ef2b8827-db0c-4c18-a81d-34939d1fa13b","Type":"ContainerDied","Data":"2e07b9d96f0ab11d3042199b7a75f309a1e2e336683c3481cab743400c29f028"} Nov 26 15:32:01 crc kubenswrapper[5200]: I1126 15:32:01.600269 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldxq7" event={"ID":"ef2b8827-db0c-4c18-a81d-34939d1fa13b","Type":"ContainerStarted","Data":"ec436b79780e42cc2907202385a5f5d3c3de4b157d5d08a05d5b5b3d11c8adf8"} Nov 26 15:32:03 crc kubenswrapper[5200]: I1126 15:32:03.623113 5200 generic.go:358] "Generic (PLEG): container finished" podID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerID="83466349c96e05f74cfbc513e84d215d255dc4b7fc179a035877e8d43f342db7" exitCode=0 Nov 26 15:32:03 crc kubenswrapper[5200]: I1126 15:32:03.623195 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldxq7" event={"ID":"ef2b8827-db0c-4c18-a81d-34939d1fa13b","Type":"ContainerDied","Data":"83466349c96e05f74cfbc513e84d215d255dc4b7fc179a035877e8d43f342db7"} Nov 26 15:32:04 crc kubenswrapper[5200]: I1126 15:32:04.636080 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldxq7" event={"ID":"ef2b8827-db0c-4c18-a81d-34939d1fa13b","Type":"ContainerStarted","Data":"078b7900b0fca0d7cd61e05b819d1ddaa94e0a7f45ef09ef75784890f8d5ea01"} Nov 26 15:32:04 crc kubenswrapper[5200]: I1126 15:32:04.661039 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ldxq7" podStartSLOduration=3.546912758 podStartE2EDuration="4.661021938s" podCreationTimestamp="2025-11-26 15:32:00 +0000 UTC" firstStartedPulling="2025-11-26 15:32:01.60062281 +0000 UTC m=+7290.279824628" lastFinishedPulling="2025-11-26 15:32:02.71473199 +0000 UTC m=+7291.393933808" observedRunningTime="2025-11-26 15:32:04.652478193 +0000 UTC m=+7293.331680031" watchObservedRunningTime="2025-11-26 15:32:04.661021938 +0000 UTC m=+7293.340223756" Nov 26 15:32:10 crc kubenswrapper[5200]: I1126 15:32:10.754917 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:10 crc kubenswrapper[5200]: I1126 15:32:10.755807 5200 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:10 crc kubenswrapper[5200]: I1126 15:32:10.809231 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:10 crc kubenswrapper[5200]: I1126 15:32:10.969380 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:32:10 crc kubenswrapper[5200]: E1126 15:32:10.969797 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:32:11 crc kubenswrapper[5200]: I1126 15:32:11.777354 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:11 crc kubenswrapper[5200]: I1126 15:32:11.829392 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ldxq7"] Nov 26 15:32:13 crc kubenswrapper[5200]: I1126 15:32:13.931625 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ldxq7" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="registry-server" containerID="cri-o://078b7900b0fca0d7cd61e05b819d1ddaa94e0a7f45ef09ef75784890f8d5ea01" gracePeriod=2 Nov 26 15:32:14 crc kubenswrapper[5200]: I1126 15:32:14.955510 5200 generic.go:358] "Generic (PLEG): container finished" podID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerID="078b7900b0fca0d7cd61e05b819d1ddaa94e0a7f45ef09ef75784890f8d5ea01" exitCode=0 Nov 26 15:32:14 crc kubenswrapper[5200]: I1126 15:32:14.955614 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldxq7" event={"ID":"ef2b8827-db0c-4c18-a81d-34939d1fa13b","Type":"ContainerDied","Data":"078b7900b0fca0d7cd61e05b819d1ddaa94e0a7f45ef09ef75784890f8d5ea01"} Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.055366 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.167427 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm299\" (UniqueName: \"kubernetes.io/projected/ef2b8827-db0c-4c18-a81d-34939d1fa13b-kube-api-access-pm299\") pod \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.167588 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-catalog-content\") pod \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.168044 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-utilities\") pod \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\" (UID: \"ef2b8827-db0c-4c18-a81d-34939d1fa13b\") " Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.168855 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-utilities" (OuterVolumeSpecName: "utilities") pod "ef2b8827-db0c-4c18-a81d-34939d1fa13b" (UID: "ef2b8827-db0c-4c18-a81d-34939d1fa13b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.179960 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2b8827-db0c-4c18-a81d-34939d1fa13b-kube-api-access-pm299" (OuterVolumeSpecName: "kube-api-access-pm299") pod "ef2b8827-db0c-4c18-a81d-34939d1fa13b" (UID: "ef2b8827-db0c-4c18-a81d-34939d1fa13b"). InnerVolumeSpecName "kube-api-access-pm299". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.207238 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef2b8827-db0c-4c18-a81d-34939d1fa13b" (UID: "ef2b8827-db0c-4c18-a81d-34939d1fa13b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.270216 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pm299\" (UniqueName: \"kubernetes.io/projected/ef2b8827-db0c-4c18-a81d-34939d1fa13b-kube-api-access-pm299\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.270253 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.270264 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef2b8827-db0c-4c18-a81d-34939d1fa13b-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.972843 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ldxq7" Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.972853 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldxq7" event={"ID":"ef2b8827-db0c-4c18-a81d-34939d1fa13b","Type":"ContainerDied","Data":"ec436b79780e42cc2907202385a5f5d3c3de4b157d5d08a05d5b5b3d11c8adf8"} Nov 26 15:32:15 crc kubenswrapper[5200]: I1126 15:32:15.973276 5200 scope.go:117] "RemoveContainer" containerID="078b7900b0fca0d7cd61e05b819d1ddaa94e0a7f45ef09ef75784890f8d5ea01" Nov 26 15:32:16 crc kubenswrapper[5200]: I1126 15:32:16.003597 5200 scope.go:117] "RemoveContainer" containerID="83466349c96e05f74cfbc513e84d215d255dc4b7fc179a035877e8d43f342db7" Nov 26 15:32:16 crc kubenswrapper[5200]: I1126 15:32:16.021486 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ldxq7"] Nov 26 15:32:16 crc kubenswrapper[5200]: I1126 15:32:16.030236 5200 scope.go:117] "RemoveContainer" containerID="2e07b9d96f0ab11d3042199b7a75f309a1e2e336683c3481cab743400c29f028" Nov 26 15:32:16 crc kubenswrapper[5200]: I1126 15:32:16.031905 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ldxq7"] Nov 26 15:32:16 crc kubenswrapper[5200]: I1126 15:32:16.981701 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" path="/var/lib/kubelet/pods/ef2b8827-db0c-4c18-a81d-34939d1fa13b/volumes" Nov 26 15:32:21 crc kubenswrapper[5200]: I1126 15:32:21.970644 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:32:21 crc kubenswrapper[5200]: E1126 15:32:21.972386 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:32:33 crc kubenswrapper[5200]: I1126 15:32:33.970045 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:32:33 crc kubenswrapper[5200]: E1126 15:32:33.972535 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:32:41 crc kubenswrapper[5200]: I1126 15:32:41.246612 5200 generic.go:358] "Generic (PLEG): container finished" podID="30e6b5f6-86f5-422f-92c2-f862bb5ef259" containerID="7fc2a2a6240aaddf3de082e2bdcba8bf894e9ef800f9dc677f1a4650baf2b598" exitCode=0 Nov 26 15:32:41 crc kubenswrapper[5200]: I1126 15:32:41.246681 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" event={"ID":"30e6b5f6-86f5-422f-92c2-f862bb5ef259","Type":"ContainerDied","Data":"7fc2a2a6240aaddf3de082e2bdcba8bf894e9ef800f9dc677f1a4650baf2b598"} Nov 26 15:32:42 crc kubenswrapper[5200]: E1126 15:32:42.630215 5200 prober.go:256] "Unable to 
write all bytes from execInContainer" err="short write" expectedBytes=2441723 actualBytes=10240 Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.731468 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.836944 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ceph\") pod \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.837328 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg9c7\" (UniqueName: \"kubernetes.io/projected/30e6b5f6-86f5-422f-92c2-f862bb5ef259-kube-api-access-tg9c7\") pod \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.837399 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ssh-key\") pod \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.837474 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-inventory\") pod \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.837505 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-bootstrap-combined-ca-bundle\") pod \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\" (UID: \"30e6b5f6-86f5-422f-92c2-f862bb5ef259\") " Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.843425 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ceph" (OuterVolumeSpecName: "ceph") pod "30e6b5f6-86f5-422f-92c2-f862bb5ef259" (UID: "30e6b5f6-86f5-422f-92c2-f862bb5ef259"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.844019 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e6b5f6-86f5-422f-92c2-f862bb5ef259-kube-api-access-tg9c7" (OuterVolumeSpecName: "kube-api-access-tg9c7") pod "30e6b5f6-86f5-422f-92c2-f862bb5ef259" (UID: "30e6b5f6-86f5-422f-92c2-f862bb5ef259"). InnerVolumeSpecName "kube-api-access-tg9c7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.844070 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "30e6b5f6-86f5-422f-92c2-f862bb5ef259" (UID: "30e6b5f6-86f5-422f-92c2-f862bb5ef259"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.870313 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-inventory" (OuterVolumeSpecName: "inventory") pod "30e6b5f6-86f5-422f-92c2-f862bb5ef259" (UID: "30e6b5f6-86f5-422f-92c2-f862bb5ef259"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.873347 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "30e6b5f6-86f5-422f-92c2-f862bb5ef259" (UID: "30e6b5f6-86f5-422f-92c2-f862bb5ef259"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.940897 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tg9c7\" (UniqueName: \"kubernetes.io/projected/30e6b5f6-86f5-422f-92c2-f862bb5ef259-kube-api-access-tg9c7\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.940941 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.940952 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.940964 5200 reconciler_common.go:299] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:42 crc kubenswrapper[5200]: I1126 15:32:42.940975 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30e6b5f6-86f5-422f-92c2-f862bb5ef259-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.273229 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.273317 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rg4cg" event={"ID":"30e6b5f6-86f5-422f-92c2-f862bb5ef259","Type":"ContainerDied","Data":"fdb217a860aea0dfcd58f45406039ebc727e0e9599056f4b1f3a669864c05eb7"} Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.273530 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb217a860aea0dfcd58f45406039ebc727e0e9599056f4b1f3a669864c05eb7" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.357179 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-shd7t"] Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382137 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="extract-content" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382187 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="extract-content" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382283 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="registry-server" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382290 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="registry-server" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382312 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30e6b5f6-86f5-422f-92c2-f862bb5ef259" containerName="bootstrap-openstack-openstack-cell1" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382319 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e6b5f6-86f5-422f-92c2-f862bb5ef259" containerName="bootstrap-openstack-openstack-cell1" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382427 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="extract-utilities" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.382434 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="extract-utilities" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.383289 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef2b8827-db0c-4c18-a81d-34939d1fa13b" containerName="registry-server" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.383321 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="30e6b5f6-86f5-422f-92c2-f862bb5ef259" containerName="bootstrap-openstack-openstack-cell1" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.563765 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-shd7t"] Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.563943 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.566225 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.566284 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.566668 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.566912 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.659969 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ceph\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.660094 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-inventory\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.660556 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ssh-key\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.660709 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn59g\" (UniqueName: \"kubernetes.io/projected/472d7d34-e34d-4b58-9c6a-f290dd6561f0-kube-api-access-jn59g\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.763910 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ceph\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.764050 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-inventory\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.764259 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ssh-key\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.764324 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn59g\" (UniqueName: \"kubernetes.io/projected/472d7d34-e34d-4b58-9c6a-f290dd6561f0-kube-api-access-jn59g\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.772459 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ssh-key\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.772535 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-inventory\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.772783 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ceph\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.782437 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn59g\" (UniqueName: \"kubernetes.io/projected/472d7d34-e34d-4b58-9c6a-f290dd6561f0-kube-api-access-jn59g\") pod \"download-cache-openstack-openstack-cell1-shd7t\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:43 crc kubenswrapper[5200]: I1126 15:32:43.883277 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:32:44 crc kubenswrapper[5200]: I1126 15:32:44.435251 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-shd7t"] Nov 26 15:32:45 crc kubenswrapper[5200]: I1126 15:32:45.290300 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" event={"ID":"472d7d34-e34d-4b58-9c6a-f290dd6561f0","Type":"ContainerStarted","Data":"9a9bc1739499974481d62e975631be46ed1d798a42bf7e7b306421a1e5ca4aeb"} Nov 26 15:32:46 crc kubenswrapper[5200]: I1126 15:32:46.305374 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" event={"ID":"472d7d34-e34d-4b58-9c6a-f290dd6561f0","Type":"ContainerStarted","Data":"b3d39ca5f44e5f8b15eb2a61f0d969ab29ad82c519221f4f90597f6aa0e53461"} Nov 26 15:32:46 crc kubenswrapper[5200]: I1126 15:32:46.334128 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" podStartSLOduration=2.548129951 podStartE2EDuration="3.334110568s" podCreationTimestamp="2025-11-26 15:32:43 +0000 UTC" firstStartedPulling="2025-11-26 15:32:44.442870204 +0000 UTC m=+7333.122072022" lastFinishedPulling="2025-11-26 15:32:45.228850821 +0000 UTC m=+7333.908052639" observedRunningTime="2025-11-26 15:32:46.329152795 +0000 UTC m=+7335.008354613" watchObservedRunningTime="2025-11-26 15:32:46.334110568 +0000 UTC m=+7335.013312386" Nov 26 15:32:46 crc kubenswrapper[5200]: I1126 15:32:46.969109 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:32:46 crc kubenswrapper[5200]: E1126 15:32:46.969635 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:33:01 crc kubenswrapper[5200]: I1126 15:33:01.969793 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:33:01 crc kubenswrapper[5200]: E1126 15:33:01.970644 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:33:16 crc kubenswrapper[5200]: I1126 15:33:16.969791 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:33:16 crc kubenswrapper[5200]: E1126 15:33:16.970606 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:33:28 crc kubenswrapper[5200]: I1126 15:33:28.970183 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:33:28 crc kubenswrapper[5200]: E1126 15:33:28.972080 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:33:40 crc kubenswrapper[5200]: I1126 15:33:40.970992 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:33:40 crc kubenswrapper[5200]: E1126 15:33:40.971819 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:33:42 crc kubenswrapper[5200]: E1126 15:33:42.518435 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441883 actualBytes=10240 Nov 26 15:33:55 crc kubenswrapper[5200]: I1126 15:33:55.970999 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:33:55 crc kubenswrapper[5200]: E1126 15:33:55.972859 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:34:10 crc kubenswrapper[5200]: I1126 15:34:10.971760 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:34:10 crc kubenswrapper[5200]: E1126 15:34:10.975291 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:34:19 crc kubenswrapper[5200]: I1126 15:34:19.232674 5200 generic.go:358] "Generic (PLEG): container finished" podID="472d7d34-e34d-4b58-9c6a-f290dd6561f0" containerID="b3d39ca5f44e5f8b15eb2a61f0d969ab29ad82c519221f4f90597f6aa0e53461" exitCode=0 Nov 26 15:34:19 crc kubenswrapper[5200]: I1126 15:34:19.232820 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" event={"ID":"472d7d34-e34d-4b58-9c6a-f290dd6561f0","Type":"ContainerDied","Data":"b3d39ca5f44e5f8b15eb2a61f0d969ab29ad82c519221f4f90597f6aa0e53461"} Nov 26 15:34:20 crc kubenswrapper[5200]: 
I1126 15:34:20.723719 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.814805 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn59g\" (UniqueName: \"kubernetes.io/projected/472d7d34-e34d-4b58-9c6a-f290dd6561f0-kube-api-access-jn59g\") pod \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.815047 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ceph\") pod \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.815168 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-inventory\") pod \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.815467 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ssh-key\") pod \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\" (UID: \"472d7d34-e34d-4b58-9c6a-f290dd6561f0\") " Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.821055 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ceph" (OuterVolumeSpecName: "ceph") pod "472d7d34-e34d-4b58-9c6a-f290dd6561f0" (UID: "472d7d34-e34d-4b58-9c6a-f290dd6561f0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.821261 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472d7d34-e34d-4b58-9c6a-f290dd6561f0-kube-api-access-jn59g" (OuterVolumeSpecName: "kube-api-access-jn59g") pod "472d7d34-e34d-4b58-9c6a-f290dd6561f0" (UID: "472d7d34-e34d-4b58-9c6a-f290dd6561f0"). InnerVolumeSpecName "kube-api-access-jn59g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.847194 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-inventory" (OuterVolumeSpecName: "inventory") pod "472d7d34-e34d-4b58-9c6a-f290dd6561f0" (UID: "472d7d34-e34d-4b58-9c6a-f290dd6561f0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.847385 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "472d7d34-e34d-4b58-9c6a-f290dd6561f0" (UID: "472d7d34-e34d-4b58-9c6a-f290dd6561f0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.918536 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.918574 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.918586 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jn59g\" (UniqueName: \"kubernetes.io/projected/472d7d34-e34d-4b58-9c6a-f290dd6561f0-kube-api-access-jn59g\") on node \"crc\" DevicePath \"\"" Nov 26 15:34:20 crc kubenswrapper[5200]: I1126 15:34:20.918601 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472d7d34-e34d-4b58-9c6a-f290dd6561f0-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.258319 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" event={"ID":"472d7d34-e34d-4b58-9c6a-f290dd6561f0","Type":"ContainerDied","Data":"9a9bc1739499974481d62e975631be46ed1d798a42bf7e7b306421a1e5ca4aeb"} Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.258366 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a9bc1739499974481d62e975631be46ed1d798a42bf7e7b306421a1e5ca4aeb" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.258332 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-shd7t" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.355431 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-zd96f"] Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.363174 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="472d7d34-e34d-4b58-9c6a-f290dd6561f0" containerName="download-cache-openstack-openstack-cell1" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.363197 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="472d7d34-e34d-4b58-9c6a-f290dd6561f0" containerName="download-cache-openstack-openstack-cell1" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.363413 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="472d7d34-e34d-4b58-9c6a-f290dd6561f0" containerName="download-cache-openstack-openstack-cell1" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.369791 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.371254 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-zd96f"] Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.379101 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.379301 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.379644 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.379820 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.531188 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ceph\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.531258 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprmt\" (UniqueName: \"kubernetes.io/projected/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-kube-api-access-wprmt\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.531372 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ssh-key\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.531451 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-inventory\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.633220 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ssh-key\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.633335 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-inventory\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: 
\"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.633410 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ceph\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.633449 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wprmt\" (UniqueName: \"kubernetes.io/projected/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-kube-api-access-wprmt\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.639710 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-inventory\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.643226 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ceph\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.644295 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ssh-key\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.650242 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprmt\" (UniqueName: \"kubernetes.io/projected/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-kube-api-access-wprmt\") pod \"configure-network-openstack-openstack-cell1-zd96f\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:21 crc kubenswrapper[5200]: I1126 15:34:21.699586 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:34:22 crc kubenswrapper[5200]: I1126 15:34:22.239494 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-zd96f"] Nov 26 15:34:22 crc kubenswrapper[5200]: I1126 15:34:22.279778 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" event={"ID":"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c","Type":"ContainerStarted","Data":"288d27d4f43e46235fcd3c42c8bf35490f0ec1e8643b293a45a262effb1d4e31"} Nov 26 15:34:24 crc kubenswrapper[5200]: I1126 15:34:24.301643 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" event={"ID":"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c","Type":"ContainerStarted","Data":"d57ce05ac151a85cb057ea433ece7f13961e98c2b286708985b1602c20b78ee6"} Nov 26 15:34:24 crc kubenswrapper[5200]: I1126 15:34:24.322538 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" podStartSLOduration=1.6353750740000002 podStartE2EDuration="3.322520369s" podCreationTimestamp="2025-11-26 15:34:21 +0000 UTC" firstStartedPulling="2025-11-26 15:34:22.242729649 +0000 UTC m=+7430.921931467" lastFinishedPulling="2025-11-26 15:34:23.929874904 +0000 UTC m=+7432.609076762" observedRunningTime="2025-11-26 15:34:24.317565056 +0000 UTC m=+7432.996766874" watchObservedRunningTime="2025-11-26 15:34:24.322520369 +0000 UTC m=+7433.001722187" Nov 26 15:34:25 crc kubenswrapper[5200]: I1126 15:34:25.969257 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:34:25 crc kubenswrapper[5200]: E1126 15:34:25.970112 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:34:40 crc kubenswrapper[5200]: I1126 15:34:40.971902 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:34:41 crc kubenswrapper[5200]: I1126 15:34:41.461827 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"905a4b74f3f4f5dcadc2b620234735b5646a4c3201da89896a20ea9b3d410457"} Nov 26 15:34:42 crc kubenswrapper[5200]: E1126 15:34:42.669360 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441917 actualBytes=10240 Nov 26 15:35:38 crc kubenswrapper[5200]: I1126 15:35:38.352919 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:35:38 crc kubenswrapper[5200]: I1126 15:35:38.361226 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:35:38 crc kubenswrapper[5200]: I1126 15:35:38.362659 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:35:38 crc kubenswrapper[5200]: I1126 15:35:38.375158 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:35:40 crc kubenswrapper[5200]: I1126 15:35:40.068961 5200 generic.go:358] "Generic (PLEG): container finished" podID="e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" containerID="d57ce05ac151a85cb057ea433ece7f13961e98c2b286708985b1602c20b78ee6" exitCode=0 Nov 26 15:35:40 crc kubenswrapper[5200]: I1126 15:35:40.069625 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" event={"ID":"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c","Type":"ContainerDied","Data":"d57ce05ac151a85cb057ea433ece7f13961e98c2b286708985b1602c20b78ee6"} Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.773445 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.867938 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-inventory\") pod \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.868136 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ceph\") pod \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.868530 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wprmt\" (UniqueName: \"kubernetes.io/projected/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-kube-api-access-wprmt\") pod \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.868645 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ssh-key\") pod \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\" (UID: \"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c\") " Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.880835 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ceph" (OuterVolumeSpecName: "ceph") pod "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" (UID: "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.880900 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-kube-api-access-wprmt" (OuterVolumeSpecName: "kube-api-access-wprmt") pod "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" (UID: "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c"). InnerVolumeSpecName "kube-api-access-wprmt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.900956 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-inventory" (OuterVolumeSpecName: "inventory") pod "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" (UID: "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.909030 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" (UID: "e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.972217 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wprmt\" (UniqueName: \"kubernetes.io/projected/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-kube-api-access-wprmt\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.972239 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.972250 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:41 crc kubenswrapper[5200]: I1126 15:35:41.972260 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.089239 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.089271 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-zd96f" event={"ID":"e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c","Type":"ContainerDied","Data":"288d27d4f43e46235fcd3c42c8bf35490f0ec1e8643b293a45a262effb1d4e31"} Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.089309 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288d27d4f43e46235fcd3c42c8bf35490f0ec1e8643b293a45a262effb1d4e31" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.180061 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-7hwpz"] Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.181399 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" containerName="configure-network-openstack-openstack-cell1" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.181426 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" containerName="configure-network-openstack-openstack-cell1" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.181624 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c" containerName="configure-network-openstack-openstack-cell1" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.190134 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.190905 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-7hwpz"] Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.193140 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.196397 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.196554 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.196564 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.279591 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ceph\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.279967 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-inventory\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc 
kubenswrapper[5200]: I1126 15:35:42.280200 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wv87\" (UniqueName: \"kubernetes.io/projected/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-kube-api-access-2wv87\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.280497 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ssh-key\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.382482 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wv87\" (UniqueName: \"kubernetes.io/projected/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-kube-api-access-2wv87\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.382586 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ssh-key\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.382721 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ceph\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.382799 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-inventory\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.388583 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ssh-key\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.399318 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-inventory\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.400717 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ceph\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.401094 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wv87\" (UniqueName: \"kubernetes.io/projected/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-kube-api-access-2wv87\") pod \"validate-network-openstack-openstack-cell1-7hwpz\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.517197 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.709616 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fcjt8"] Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.737196 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcjt8"] Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.737342 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.799001 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-utilities\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.799112 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrtmx\" (UniqueName: \"kubernetes.io/projected/150ac0cd-fed2-488b-b2e9-a5ab295734c0-kube-api-access-mrtmx\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.799491 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-catalog-content\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.904258 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-catalog-content\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.904433 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-utilities\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.904916 5200 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-catalog-content\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.904994 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-utilities\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.905138 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrtmx\" (UniqueName: \"kubernetes.io/projected/150ac0cd-fed2-488b-b2e9-a5ab295734c0-kube-api-access-mrtmx\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:42 crc kubenswrapper[5200]: I1126 15:35:42.924861 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrtmx\" (UniqueName: \"kubernetes.io/projected/150ac0cd-fed2-488b-b2e9-a5ab295734c0-kube-api-access-mrtmx\") pod \"redhat-marketplace-fcjt8\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:43 crc kubenswrapper[5200]: I1126 15:35:43.083921 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:43 crc kubenswrapper[5200]: I1126 15:35:43.156106 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-7hwpz"] Nov 26 15:35:43 crc kubenswrapper[5200]: E1126 15:35:43.377183 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2435136 actualBytes=10240 Nov 26 15:35:43 crc kubenswrapper[5200]: I1126 15:35:43.593393 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcjt8"] Nov 26 15:35:43 crc kubenswrapper[5200]: W1126 15:35:43.593761 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150ac0cd_fed2_488b_b2e9_a5ab295734c0.slice/crio-8d73737abb06f586e862e3e61b4dff73930f5d31e6b56862ce293db577327835 WatchSource:0}: Error finding container 8d73737abb06f586e862e3e61b4dff73930f5d31e6b56862ce293db577327835: Status 404 returned error can't find the container with id 8d73737abb06f586e862e3e61b4dff73930f5d31e6b56862ce293db577327835 Nov 26 15:35:44 crc kubenswrapper[5200]: I1126 15:35:44.123069 5200 generic.go:358] "Generic (PLEG): container finished" podID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerID="9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286" exitCode=0 Nov 26 15:35:44 crc kubenswrapper[5200]: I1126 15:35:44.123139 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcjt8" event={"ID":"150ac0cd-fed2-488b-b2e9-a5ab295734c0","Type":"ContainerDied","Data":"9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286"} Nov 26 15:35:44 crc kubenswrapper[5200]: I1126 15:35:44.123506 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcjt8" 
event={"ID":"150ac0cd-fed2-488b-b2e9-a5ab295734c0","Type":"ContainerStarted","Data":"8d73737abb06f586e862e3e61b4dff73930f5d31e6b56862ce293db577327835"} Nov 26 15:35:44 crc kubenswrapper[5200]: I1126 15:35:44.127396 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" event={"ID":"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c","Type":"ContainerStarted","Data":"58e000298c07beb08b56166cd9e1c7005ee384ad8519029d33522891e44842ea"} Nov 26 15:35:44 crc kubenswrapper[5200]: I1126 15:35:44.127465 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" event={"ID":"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c","Type":"ContainerStarted","Data":"7302147452eafcd59875681a8ac11d25ac5fb15a54467ae261b2d7ddba3c23e5"} Nov 26 15:35:44 crc kubenswrapper[5200]: I1126 15:35:44.159707 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" podStartSLOduration=1.668296402 podStartE2EDuration="2.159667563s" podCreationTimestamp="2025-11-26 15:35:42 +0000 UTC" firstStartedPulling="2025-11-26 15:35:43.17467051 +0000 UTC m=+7511.853872328" lastFinishedPulling="2025-11-26 15:35:43.666041671 +0000 UTC m=+7512.345243489" observedRunningTime="2025-11-26 15:35:44.156059126 +0000 UTC m=+7512.835260954" watchObservedRunningTime="2025-11-26 15:35:44.159667563 +0000 UTC m=+7512.838869381" Nov 26 15:35:46 crc kubenswrapper[5200]: I1126 15:35:46.148851 5200 generic.go:358] "Generic (PLEG): container finished" podID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerID="991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af" exitCode=0 Nov 26 15:35:46 crc kubenswrapper[5200]: I1126 15:35:46.149087 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcjt8" event={"ID":"150ac0cd-fed2-488b-b2e9-a5ab295734c0","Type":"ContainerDied","Data":"991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af"} Nov 26 15:35:47 crc kubenswrapper[5200]: I1126 15:35:47.165381 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcjt8" event={"ID":"150ac0cd-fed2-488b-b2e9-a5ab295734c0","Type":"ContainerStarted","Data":"93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3"} Nov 26 15:35:47 crc kubenswrapper[5200]: I1126 15:35:47.199904 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fcjt8" podStartSLOduration=4.119402924 podStartE2EDuration="5.199881986s" podCreationTimestamp="2025-11-26 15:35:42 +0000 UTC" firstStartedPulling="2025-11-26 15:35:44.124230463 +0000 UTC m=+7512.803432281" lastFinishedPulling="2025-11-26 15:35:45.204709525 +0000 UTC m=+7513.883911343" observedRunningTime="2025-11-26 15:35:47.189618301 +0000 UTC m=+7515.868820129" watchObservedRunningTime="2025-11-26 15:35:47.199881986 +0000 UTC m=+7515.879083824" Nov 26 15:35:49 crc kubenswrapper[5200]: I1126 15:35:49.189718 5200 generic.go:358] "Generic (PLEG): container finished" podID="bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" containerID="58e000298c07beb08b56166cd9e1c7005ee384ad8519029d33522891e44842ea" exitCode=0 Nov 26 15:35:49 crc kubenswrapper[5200]: I1126 15:35:49.189811 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" 
event={"ID":"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c","Type":"ContainerDied","Data":"58e000298c07beb08b56166cd9e1c7005ee384ad8519029d33522891e44842ea"} Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.827833 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.923894 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-inventory\") pod \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.925080 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ssh-key\") pod \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.925191 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ceph\") pod \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.925484 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wv87\" (UniqueName: \"kubernetes.io/projected/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-kube-api-access-2wv87\") pod \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\" (UID: \"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c\") " Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.930508 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ceph" (OuterVolumeSpecName: "ceph") pod "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" (UID: "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.930768 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-kube-api-access-2wv87" (OuterVolumeSpecName: "kube-api-access-2wv87") pod "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" (UID: "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c"). InnerVolumeSpecName "kube-api-access-2wv87". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.957621 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" (UID: "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:35:50 crc kubenswrapper[5200]: I1126 15:35:50.958542 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-inventory" (OuterVolumeSpecName: "inventory") pod "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" (UID: "bb94087e-4cfd-41c7-8da0-43fe92ef5d2c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.028791 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wv87\" (UniqueName: \"kubernetes.io/projected/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-kube-api-access-2wv87\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.028825 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.028834 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.028843 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb94087e-4cfd-41c7-8da0-43fe92ef5d2c-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.212034 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.212215 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-7hwpz" event={"ID":"bb94087e-4cfd-41c7-8da0-43fe92ef5d2c","Type":"ContainerDied","Data":"7302147452eafcd59875681a8ac11d25ac5fb15a54467ae261b2d7ddba3c23e5"} Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.212253 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7302147452eafcd59875681a8ac11d25ac5fb15a54467ae261b2d7ddba3c23e5" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.270720 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kbtn4"] Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.272174 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" containerName="validate-network-openstack-openstack-cell1" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.272198 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" containerName="validate-network-openstack-openstack-cell1" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.272554 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb94087e-4cfd-41c7-8da0-43fe92ef5d2c" containerName="validate-network-openstack-openstack-cell1" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.286380 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kbtn4"] Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.286522 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.289436 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.289483 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.289499 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.289543 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.337316 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-inventory\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.337383 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ssh-key\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.337436 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74j8\" (UniqueName: \"kubernetes.io/projected/8e8a19ae-5836-403f-92cc-d102b1fbe3af-kube-api-access-z74j8\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.337894 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ceph\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.440076 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ceph\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.440184 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-inventory\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.440235 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ssh-key\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.440288 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z74j8\" (UniqueName: \"kubernetes.io/projected/8e8a19ae-5836-403f-92cc-d102b1fbe3af-kube-api-access-z74j8\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.444713 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ceph\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.450962 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ssh-key\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.451845 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-inventory\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.468787 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74j8\" (UniqueName: \"kubernetes.io/projected/8e8a19ae-5836-403f-92cc-d102b1fbe3af-kube-api-access-z74j8\") pod \"install-os-openstack-openstack-cell1-kbtn4\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:51 crc kubenswrapper[5200]: I1126 15:35:51.607541 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:35:52 crc kubenswrapper[5200]: I1126 15:35:52.166644 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kbtn4"] Nov 26 15:35:52 crc kubenswrapper[5200]: I1126 15:35:52.224000 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" event={"ID":"8e8a19ae-5836-403f-92cc-d102b1fbe3af","Type":"ContainerStarted","Data":"81e7646c48c9683d53d8e21a2b09bb056af50f82218cbd8e5729e98fb58df5e1"} Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.084423 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.085486 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.139719 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.236401 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" event={"ID":"8e8a19ae-5836-403f-92cc-d102b1fbe3af","Type":"ContainerStarted","Data":"14befa6d60b50bd38b6c09d11314e8c9118c5eb5c4cbbe12dc58dc57783771ef"} Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.285833 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" podStartSLOduration=1.793403912 podStartE2EDuration="2.285815511s" podCreationTimestamp="2025-11-26 15:35:51 +0000 UTC" firstStartedPulling="2025-11-26 15:35:52.176418623 +0000 UTC m=+7520.855620441" lastFinishedPulling="2025-11-26 15:35:52.668830222 +0000 UTC m=+7521.348032040" observedRunningTime="2025-11-26 15:35:53.252048556 +0000 UTC m=+7521.931250384" watchObservedRunningTime="2025-11-26 15:35:53.285815511 +0000 UTC m=+7521.965017329" Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.313659 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:53 crc kubenswrapper[5200]: I1126 15:35:53.384140 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcjt8"] Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.253378 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fcjt8" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="registry-server" containerID="cri-o://93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3" gracePeriod=2 Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.878407 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.968439 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-catalog-content\") pod \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.968578 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrtmx\" (UniqueName: \"kubernetes.io/projected/150ac0cd-fed2-488b-b2e9-a5ab295734c0-kube-api-access-mrtmx\") pod \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.968730 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-utilities\") pod \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\" (UID: \"150ac0cd-fed2-488b-b2e9-a5ab295734c0\") " Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.969730 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-utilities" (OuterVolumeSpecName: "utilities") pod "150ac0cd-fed2-488b-b2e9-a5ab295734c0" (UID: "150ac0cd-fed2-488b-b2e9-a5ab295734c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.975950 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150ac0cd-fed2-488b-b2e9-a5ab295734c0-kube-api-access-mrtmx" (OuterVolumeSpecName: "kube-api-access-mrtmx") pod "150ac0cd-fed2-488b-b2e9-a5ab295734c0" (UID: "150ac0cd-fed2-488b-b2e9-a5ab295734c0"). InnerVolumeSpecName "kube-api-access-mrtmx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:35:55 crc kubenswrapper[5200]: I1126 15:35:55.978627 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "150ac0cd-fed2-488b-b2e9-a5ab295734c0" (UID: "150ac0cd-fed2-488b-b2e9-a5ab295734c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.073289 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.073325 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrtmx\" (UniqueName: \"kubernetes.io/projected/150ac0cd-fed2-488b-b2e9-a5ab295734c0-kube-api-access-mrtmx\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.073339 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150ac0cd-fed2-488b-b2e9-a5ab295734c0-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.268385 5200 generic.go:358] "Generic (PLEG): container finished" podID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerID="93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3" exitCode=0 Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.268659 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcjt8" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.268672 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcjt8" event={"ID":"150ac0cd-fed2-488b-b2e9-a5ab295734c0","Type":"ContainerDied","Data":"93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3"} Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.268944 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcjt8" event={"ID":"150ac0cd-fed2-488b-b2e9-a5ab295734c0","Type":"ContainerDied","Data":"8d73737abb06f586e862e3e61b4dff73930f5d31e6b56862ce293db577327835"} Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.268977 5200 scope.go:117] "RemoveContainer" containerID="93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.312774 5200 scope.go:117] "RemoveContainer" containerID="991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.320972 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcjt8"] Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.331076 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcjt8"] Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.348781 5200 scope.go:117] "RemoveContainer" containerID="9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.386758 5200 scope.go:117] "RemoveContainer" containerID="93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3" Nov 26 15:35:56 crc kubenswrapper[5200]: E1126 15:35:56.387451 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3\": container with ID starting with 93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3 not found: ID does not exist" containerID="93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.387484 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3"} err="failed to get container status \"93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3\": rpc error: code = NotFound desc = could not find container \"93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3\": container with ID starting with 93ed4f607349ce1ce9f46488af9e48b2971958fdfb66fd7b13afc44be25cd2b3 not found: ID does not exist" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.387503 5200 scope.go:117] "RemoveContainer" containerID="991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af" Nov 26 15:35:56 crc kubenswrapper[5200]: E1126 15:35:56.387825 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af\": container with ID starting with 991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af not found: ID does not exist" containerID="991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.387847 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af"} err="failed to get container status \"991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af\": rpc error: code = NotFound desc = could not find container \"991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af\": container with ID starting with 991d8ac0533884e0a17ea30325659e7542625b45d4d578ef09c1cc166c2814af not found: ID does not exist" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.387859 5200 scope.go:117] "RemoveContainer" containerID="9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286" Nov 26 15:35:56 crc kubenswrapper[5200]: E1126 15:35:56.388089 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286\": container with ID starting with 9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286 not found: ID does not exist" containerID="9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.388115 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286"} err="failed to get container status \"9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286\": rpc error: code = NotFound desc = could not find container \"9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286\": container with ID starting with 9c8910acb23858aaae1ed5d7bfc3ea2153be71df1c727062e63e2892743ad286 not found: ID does not exist" Nov 26 15:35:56 crc kubenswrapper[5200]: I1126 15:35:56.987481 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" path="/var/lib/kubelet/pods/150ac0cd-fed2-488b-b2e9-a5ab295734c0/volumes" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.779410 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wrsdd"] Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.780979 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="extract-content" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.781000 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="extract-content" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.781018 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="registry-server" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.781025 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="registry-server" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.781073 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="extract-utilities" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.781081 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="extract-utilities" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.781281 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="150ac0cd-fed2-488b-b2e9-a5ab295734c0" containerName="registry-server" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.814291 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.831081 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrsdd"] Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.837337 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-catalog-content\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.837542 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-utilities\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.837879 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mpv\" (UniqueName: \"kubernetes.io/projected/7e46d522-872b-45e6-94da-b17a2d9fa81e-kube-api-access-47mpv\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.940008 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47mpv\" (UniqueName: \"kubernetes.io/projected/7e46d522-872b-45e6-94da-b17a2d9fa81e-kube-api-access-47mpv\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.940164 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-catalog-content\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.940265 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-utilities\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.940706 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-catalog-content\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.940756 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-utilities\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:58 crc kubenswrapper[5200]: I1126 15:35:58.968993 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mpv\" (UniqueName: \"kubernetes.io/projected/7e46d522-872b-45e6-94da-b17a2d9fa81e-kube-api-access-47mpv\") pod \"redhat-operators-wrsdd\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:59 crc kubenswrapper[5200]: I1126 15:35:59.155060 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:35:59 crc kubenswrapper[5200]: I1126 15:35:59.649295 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrsdd"] Nov 26 15:36:00 crc kubenswrapper[5200]: I1126 15:36:00.319873 5200 generic.go:358] "Generic (PLEG): container finished" podID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerID="9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c" exitCode=0 Nov 26 15:36:00 crc kubenswrapper[5200]: I1126 15:36:00.319944 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerDied","Data":"9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c"} Nov 26 15:36:00 crc kubenswrapper[5200]: I1126 15:36:00.320447 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerStarted","Data":"815322bb27074bb9c7027f98d344df820f4e54f0ad20b56b5be42b78b6452851"} Nov 26 15:36:02 crc kubenswrapper[5200]: I1126 15:36:02.349039 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerStarted","Data":"0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e"} Nov 26 15:36:05 crc kubenswrapper[5200]: I1126 15:36:05.378860 5200 generic.go:358] "Generic (PLEG): container finished" podID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerID="0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e" exitCode=0 Nov 26 15:36:05 crc kubenswrapper[5200]: I1126 15:36:05.378960 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerDied","Data":"0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e"} Nov 26 15:36:06 crc kubenswrapper[5200]: I1126 15:36:06.391637 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerStarted","Data":"3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade"} Nov 26 15:36:09 crc kubenswrapper[5200]: I1126 15:36:09.155907 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:36:09 crc kubenswrapper[5200]: I1126 15:36:09.156527 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:36:10 crc kubenswrapper[5200]: I1126 15:36:10.210725 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wrsdd" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="registry-server" probeResult="failure" output=< Nov 26 15:36:10 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:36:10 crc kubenswrapper[5200]: > Nov 26 15:36:19 crc kubenswrapper[5200]: I1126 15:36:19.205177 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:36:19 crc kubenswrapper[5200]: I1126 15:36:19.226960 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wrsdd" 
podStartSLOduration=19.930144617 podStartE2EDuration="21.226939067s" podCreationTimestamp="2025-11-26 15:35:58 +0000 UTC" firstStartedPulling="2025-11-26 15:36:00.321218276 +0000 UTC m=+7529.000420094" lastFinishedPulling="2025-11-26 15:36:01.618012726 +0000 UTC m=+7530.297214544" observedRunningTime="2025-11-26 15:36:06.41949543 +0000 UTC m=+7535.098697248" watchObservedRunningTime="2025-11-26 15:36:19.226939067 +0000 UTC m=+7547.906140885" Nov 26 15:36:19 crc kubenswrapper[5200]: I1126 15:36:19.262843 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:36:19 crc kubenswrapper[5200]: I1126 15:36:19.437518 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrsdd"] Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.268590 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wrsdd" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="registry-server" containerID="cri-o://3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade" gracePeriod=2 Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.817376 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.895774 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-utilities\") pod \"7e46d522-872b-45e6-94da-b17a2d9fa81e\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.896026 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mpv\" (UniqueName: \"kubernetes.io/projected/7e46d522-872b-45e6-94da-b17a2d9fa81e-kube-api-access-47mpv\") pod \"7e46d522-872b-45e6-94da-b17a2d9fa81e\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.896057 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-catalog-content\") pod \"7e46d522-872b-45e6-94da-b17a2d9fa81e\" (UID: \"7e46d522-872b-45e6-94da-b17a2d9fa81e\") " Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.896959 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-utilities" (OuterVolumeSpecName: "utilities") pod "7e46d522-872b-45e6-94da-b17a2d9fa81e" (UID: "7e46d522-872b-45e6-94da-b17a2d9fa81e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.907128 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e46d522-872b-45e6-94da-b17a2d9fa81e-kube-api-access-47mpv" (OuterVolumeSpecName: "kube-api-access-47mpv") pod "7e46d522-872b-45e6-94da-b17a2d9fa81e" (UID: "7e46d522-872b-45e6-94da-b17a2d9fa81e"). InnerVolumeSpecName "kube-api-access-47mpv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:36:20 crc kubenswrapper[5200]: I1126 15:36:20.981735 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e46d522-872b-45e6-94da-b17a2d9fa81e" (UID: "7e46d522-872b-45e6-94da-b17a2d9fa81e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.000102 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.000150 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47mpv\" (UniqueName: \"kubernetes.io/projected/7e46d522-872b-45e6-94da-b17a2d9fa81e-kube-api-access-47mpv\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.000162 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e46d522-872b-45e6-94da-b17a2d9fa81e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.278413 5200 generic.go:358] "Generic (PLEG): container finished" podID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerID="3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade" exitCode=0 Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.278462 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerDied","Data":"3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade"} Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.278516 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wrsdd" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.278528 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrsdd" event={"ID":"7e46d522-872b-45e6-94da-b17a2d9fa81e","Type":"ContainerDied","Data":"815322bb27074bb9c7027f98d344df820f4e54f0ad20b56b5be42b78b6452851"} Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.278551 5200 scope.go:117] "RemoveContainer" containerID="3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.308891 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrsdd"] Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.318465 5200 scope.go:117] "RemoveContainer" containerID="0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.319456 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wrsdd"] Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.350538 5200 scope.go:117] "RemoveContainer" containerID="9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.382203 5200 scope.go:117] "RemoveContainer" containerID="3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade" Nov 26 15:36:21 crc kubenswrapper[5200]: E1126 15:36:21.382598 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade\": container with ID starting with 3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade not found: ID does not exist" containerID="3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.382637 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade"} err="failed to get container status \"3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade\": rpc error: code = NotFound desc = could not find container \"3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade\": container with ID starting with 3ab9bd43c368cb72340ab2a6138f3650b65b70b87dd4929a0551470b117d3ade not found: ID does not exist" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.382662 5200 scope.go:117] "RemoveContainer" containerID="0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e" Nov 26 15:36:21 crc kubenswrapper[5200]: E1126 15:36:21.383093 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e\": container with ID starting with 0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e not found: ID does not exist" containerID="0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.383120 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e"} err="failed to get container status \"0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e\": rpc error: code = NotFound desc = could not find container 
\"0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e\": container with ID starting with 0d6e7358603eb61907f1094d958e4cd6ef2bad139ed656a9a9811c3d20ad884e not found: ID does not exist" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.383148 5200 scope.go:117] "RemoveContainer" containerID="9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c" Nov 26 15:36:21 crc kubenswrapper[5200]: E1126 15:36:21.383377 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c\": container with ID starting with 9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c not found: ID does not exist" containerID="9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c" Nov 26 15:36:21 crc kubenswrapper[5200]: I1126 15:36:21.383395 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c"} err="failed to get container status \"9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c\": rpc error: code = NotFound desc = could not find container \"9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c\": container with ID starting with 9b15fac1e7c38b89832e4165c3a6a366ae02a27c8de2556f875be0459f087f3c not found: ID does not exist" Nov 26 15:36:23 crc kubenswrapper[5200]: I1126 15:36:23.004425 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" path="/var/lib/kubelet/pods/7e46d522-872b-45e6-94da-b17a2d9fa81e/volumes" Nov 26 15:36:36 crc kubenswrapper[5200]: I1126 15:36:36.445512 5200 generic.go:358] "Generic (PLEG): container finished" podID="8e8a19ae-5836-403f-92cc-d102b1fbe3af" containerID="14befa6d60b50bd38b6c09d11314e8c9118c5eb5c4cbbe12dc58dc57783771ef" exitCode=0 Nov 26 15:36:36 crc kubenswrapper[5200]: I1126 15:36:36.446210 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" event={"ID":"8e8a19ae-5836-403f-92cc-d102b1fbe3af","Type":"ContainerDied","Data":"14befa6d60b50bd38b6c09d11314e8c9118c5eb5c4cbbe12dc58dc57783771ef"} Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.085061 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.111601 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ssh-key\") pod \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.111976 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-inventory\") pod \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.112053 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z74j8\" (UniqueName: \"kubernetes.io/projected/8e8a19ae-5836-403f-92cc-d102b1fbe3af-kube-api-access-z74j8\") pod \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.112197 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ceph\") pod \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\" (UID: \"8e8a19ae-5836-403f-92cc-d102b1fbe3af\") " Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.125794 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ceph" (OuterVolumeSpecName: "ceph") pod "8e8a19ae-5836-403f-92cc-d102b1fbe3af" (UID: "8e8a19ae-5836-403f-92cc-d102b1fbe3af"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.125887 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8a19ae-5836-403f-92cc-d102b1fbe3af-kube-api-access-z74j8" (OuterVolumeSpecName: "kube-api-access-z74j8") pod "8e8a19ae-5836-403f-92cc-d102b1fbe3af" (UID: "8e8a19ae-5836-403f-92cc-d102b1fbe3af"). InnerVolumeSpecName "kube-api-access-z74j8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.175117 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-inventory" (OuterVolumeSpecName: "inventory") pod "8e8a19ae-5836-403f-92cc-d102b1fbe3af" (UID: "8e8a19ae-5836-403f-92cc-d102b1fbe3af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.195834 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e8a19ae-5836-403f-92cc-d102b1fbe3af" (UID: "8e8a19ae-5836-403f-92cc-d102b1fbe3af"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.214849 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z74j8\" (UniqueName: \"kubernetes.io/projected/8e8a19ae-5836-403f-92cc-d102b1fbe3af-kube-api-access-z74j8\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.214896 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.214910 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.214921 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e8a19ae-5836-403f-92cc-d102b1fbe3af-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.471586 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" event={"ID":"8e8a19ae-5836-403f-92cc-d102b1fbe3af","Type":"ContainerDied","Data":"81e7646c48c9683d53d8e21a2b09bb056af50f82218cbd8e5729e98fb58df5e1"} Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.471628 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e7646c48c9683d53d8e21a2b09bb056af50f82218cbd8e5729e98fb58df5e1" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.471705 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kbtn4" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.548898 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rgb5g"] Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550364 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="registry-server" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550458 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="registry-server" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550544 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e8a19ae-5836-403f-92cc-d102b1fbe3af" containerName="install-os-openstack-openstack-cell1" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550611 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8a19ae-5836-403f-92cc-d102b1fbe3af" containerName="install-os-openstack-openstack-cell1" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550704 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="extract-content" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550772 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="extract-content" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.550857 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="extract-utilities" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 
15:36:38.550930 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="extract-utilities" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.551184 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e46d522-872b-45e6-94da-b17a2d9fa81e" containerName="registry-server" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.551279 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8a19ae-5836-403f-92cc-d102b1fbe3af" containerName="install-os-openstack-openstack-cell1" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.556944 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.559725 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.560226 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.560884 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.561128 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.568080 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rgb5g"] Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.623057 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.623137 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ceph\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.623199 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snkfv\" (UniqueName: \"kubernetes.io/projected/603bb6c7-0b64-4c48-ac82-ce369807a22d-kube-api-access-snkfv\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.623491 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-inventory\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.725484 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-inventory\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.725595 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.725651 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ceph\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.725698 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snkfv\" (UniqueName: \"kubernetes.io/projected/603bb6c7-0b64-4c48-ac82-ce369807a22d-kube-api-access-snkfv\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.730954 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-inventory\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.731435 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.732447 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ceph\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.745582 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkfv\" (UniqueName: \"kubernetes.io/projected/603bb6c7-0b64-4c48-ac82-ce369807a22d-kube-api-access-snkfv\") pod \"configure-os-openstack-openstack-cell1-rgb5g\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:38 crc kubenswrapper[5200]: I1126 15:36:38.880184 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:36:39 crc kubenswrapper[5200]: I1126 15:36:39.451308 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rgb5g"] Nov 26 15:36:39 crc kubenswrapper[5200]: W1126 15:36:39.464754 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603bb6c7_0b64_4c48_ac82_ce369807a22d.slice/crio-8cd851982216cd16247417ae71f5845eedc9443834d22ad7cebac51f3490800e WatchSource:0}: Error finding container 8cd851982216cd16247417ae71f5845eedc9443834d22ad7cebac51f3490800e: Status 404 returned error can't find the container with id 8cd851982216cd16247417ae71f5845eedc9443834d22ad7cebac51f3490800e Nov 26 15:36:39 crc kubenswrapper[5200]: I1126 15:36:39.489335 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" event={"ID":"603bb6c7-0b64-4c48-ac82-ce369807a22d","Type":"ContainerStarted","Data":"8cd851982216cd16247417ae71f5845eedc9443834d22ad7cebac51f3490800e"} Nov 26 15:36:40 crc kubenswrapper[5200]: I1126 15:36:40.502244 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" event={"ID":"603bb6c7-0b64-4c48-ac82-ce369807a22d","Type":"ContainerStarted","Data":"d5b68f974aeddef3fac83ecf24d9d3c8811bb74e4ca78f43d59994031e8f6bbd"} Nov 26 15:36:40 crc kubenswrapper[5200]: I1126 15:36:40.518483 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" podStartSLOduration=1.765561038 podStartE2EDuration="2.518454118s" podCreationTimestamp="2025-11-26 15:36:38 +0000 UTC" firstStartedPulling="2025-11-26 15:36:39.467096727 +0000 UTC m=+7568.146298585" lastFinishedPulling="2025-11-26 15:36:40.219989847 +0000 UTC m=+7568.899191665" observedRunningTime="2025-11-26 15:36:40.517216945 +0000 UTC m=+7569.196418773" watchObservedRunningTime="2025-11-26 15:36:40.518454118 +0000 UTC m=+7569.197655946" Nov 26 15:36:42 crc kubenswrapper[5200]: E1126 15:36:42.634178 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441842 actualBytes=10240 Nov 26 15:37:03 crc kubenswrapper[5200]: I1126 15:37:03.154622 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:37:03 crc kubenswrapper[5200]: I1126 15:37:03.156564 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:37:24 crc kubenswrapper[5200]: I1126 15:37:24.935787 5200 generic.go:358] "Generic (PLEG): container finished" podID="603bb6c7-0b64-4c48-ac82-ce369807a22d" containerID="d5b68f974aeddef3fac83ecf24d9d3c8811bb74e4ca78f43d59994031e8f6bbd" exitCode=0 Nov 26 15:37:24 crc kubenswrapper[5200]: I1126 15:37:24.935895 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" 
event={"ID":"603bb6c7-0b64-4c48-ac82-ce369807a22d","Type":"ContainerDied","Data":"d5b68f974aeddef3fac83ecf24d9d3c8811bb74e4ca78f43d59994031e8f6bbd"} Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.469887 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.586011 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ceph\") pod \"603bb6c7-0b64-4c48-ac82-ce369807a22d\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.586716 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-inventory\") pod \"603bb6c7-0b64-4c48-ac82-ce369807a22d\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.586773 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snkfv\" (UniqueName: \"kubernetes.io/projected/603bb6c7-0b64-4c48-ac82-ce369807a22d-kube-api-access-snkfv\") pod \"603bb6c7-0b64-4c48-ac82-ce369807a22d\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.586879 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ssh-key\") pod \"603bb6c7-0b64-4c48-ac82-ce369807a22d\" (UID: \"603bb6c7-0b64-4c48-ac82-ce369807a22d\") " Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.603880 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603bb6c7-0b64-4c48-ac82-ce369807a22d-kube-api-access-snkfv" (OuterVolumeSpecName: "kube-api-access-snkfv") pod "603bb6c7-0b64-4c48-ac82-ce369807a22d" (UID: "603bb6c7-0b64-4c48-ac82-ce369807a22d"). InnerVolumeSpecName "kube-api-access-snkfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.604016 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ceph" (OuterVolumeSpecName: "ceph") pod "603bb6c7-0b64-4c48-ac82-ce369807a22d" (UID: "603bb6c7-0b64-4c48-ac82-ce369807a22d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.627939 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-inventory" (OuterVolumeSpecName: "inventory") pod "603bb6c7-0b64-4c48-ac82-ce369807a22d" (UID: "603bb6c7-0b64-4c48-ac82-ce369807a22d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.630378 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "603bb6c7-0b64-4c48-ac82-ce369807a22d" (UID: "603bb6c7-0b64-4c48-ac82-ce369807a22d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.688898 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.689323 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snkfv\" (UniqueName: \"kubernetes.io/projected/603bb6c7-0b64-4c48-ac82-ce369807a22d-kube-api-access-snkfv\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.690778 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.690923 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/603bb6c7-0b64-4c48-ac82-ce369807a22d-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.957935 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" event={"ID":"603bb6c7-0b64-4c48-ac82-ce369807a22d","Type":"ContainerDied","Data":"8cd851982216cd16247417ae71f5845eedc9443834d22ad7cebac51f3490800e"} Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.957984 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cd851982216cd16247417ae71f5845eedc9443834d22ad7cebac51f3490800e" Nov 26 15:37:26 crc kubenswrapper[5200]: I1126 15:37:26.957983 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rgb5g" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.064799 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-w5njm"] Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.066432 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="603bb6c7-0b64-4c48-ac82-ce369807a22d" containerName="configure-os-openstack-openstack-cell1" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.066461 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="603bb6c7-0b64-4c48-ac82-ce369807a22d" containerName="configure-os-openstack-openstack-cell1" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.066886 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="603bb6c7-0b64-4c48-ac82-ce369807a22d" containerName="configure-os-openstack-openstack-cell1" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.073596 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.080871 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-w5njm"] Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.084275 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.084517 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.084645 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.084933 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.206976 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-inventory-0\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.207234 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmgkx\" (UniqueName: \"kubernetes.io/projected/52cd4880-5b1a-47af-9502-5ce41c93a94f-kube-api-access-jmgkx\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.207349 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ceph\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.207480 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.309993 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmgkx\" (UniqueName: \"kubernetes.io/projected/52cd4880-5b1a-47af-9502-5ce41c93a94f-kube-api-access-jmgkx\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.310245 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ceph\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.310487 5200 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.310763 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-inventory-0\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.315797 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ceph\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.315804 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-inventory-0\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.318255 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.331320 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmgkx\" (UniqueName: \"kubernetes.io/projected/52cd4880-5b1a-47af-9502-5ce41c93a94f-kube-api-access-jmgkx\") pod \"ssh-known-hosts-openstack-w5njm\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.404014 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.957081 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-w5njm"] Nov 26 15:37:27 crc kubenswrapper[5200]: I1126 15:37:27.966800 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:37:28 crc kubenswrapper[5200]: I1126 15:37:28.980887 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-w5njm" event={"ID":"52cd4880-5b1a-47af-9502-5ce41c93a94f","Type":"ContainerStarted","Data":"02821997480b1bc7aa2d46c4028459da6571d11f8d8060f5b268991fce5fa573"} Nov 26 15:37:28 crc kubenswrapper[5200]: I1126 15:37:28.982321 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-w5njm" event={"ID":"52cd4880-5b1a-47af-9502-5ce41c93a94f","Type":"ContainerStarted","Data":"ffeaae942cf8e6c77d3e92a41b60b57897b1e7b1199108a4bc07873f2a1ee05b"} Nov 26 15:37:29 crc kubenswrapper[5200]: I1126 15:37:29.006274 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-w5njm" podStartSLOduration=1.318031599 podStartE2EDuration="2.006228017s" podCreationTimestamp="2025-11-26 15:37:27 +0000 UTC" firstStartedPulling="2025-11-26 15:37:27.967571615 +0000 UTC m=+7616.646773443" lastFinishedPulling="2025-11-26 15:37:28.655768053 +0000 UTC m=+7617.334969861" observedRunningTime="2025-11-26 15:37:29.001074299 +0000 UTC m=+7617.680276117" watchObservedRunningTime="2025-11-26 15:37:29.006228017 +0000 UTC m=+7617.685429835" Nov 26 15:37:33 crc kubenswrapper[5200]: I1126 15:37:33.154130 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:37:33 crc kubenswrapper[5200]: I1126 15:37:33.154762 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:37:37 crc kubenswrapper[5200]: I1126 15:37:37.067461 5200 generic.go:358] "Generic (PLEG): container finished" podID="52cd4880-5b1a-47af-9502-5ce41c93a94f" containerID="02821997480b1bc7aa2d46c4028459da6571d11f8d8060f5b268991fce5fa573" exitCode=0 Nov 26 15:37:37 crc kubenswrapper[5200]: I1126 15:37:37.067517 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-w5njm" event={"ID":"52cd4880-5b1a-47af-9502-5ce41c93a94f","Type":"ContainerDied","Data":"02821997480b1bc7aa2d46c4028459da6571d11f8d8060f5b268991fce5fa573"} Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.541093 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.707670 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ceph\") pod \"52cd4880-5b1a-47af-9502-5ce41c93a94f\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.707819 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmgkx\" (UniqueName: \"kubernetes.io/projected/52cd4880-5b1a-47af-9502-5ce41c93a94f-kube-api-access-jmgkx\") pod \"52cd4880-5b1a-47af-9502-5ce41c93a94f\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.707937 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-inventory-0\") pod \"52cd4880-5b1a-47af-9502-5ce41c93a94f\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.708292 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ssh-key-openstack-cell1\") pod \"52cd4880-5b1a-47af-9502-5ce41c93a94f\" (UID: \"52cd4880-5b1a-47af-9502-5ce41c93a94f\") " Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.716107 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ceph" (OuterVolumeSpecName: "ceph") pod "52cd4880-5b1a-47af-9502-5ce41c93a94f" (UID: "52cd4880-5b1a-47af-9502-5ce41c93a94f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.716175 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cd4880-5b1a-47af-9502-5ce41c93a94f-kube-api-access-jmgkx" (OuterVolumeSpecName: "kube-api-access-jmgkx") pod "52cd4880-5b1a-47af-9502-5ce41c93a94f" (UID: "52cd4880-5b1a-47af-9502-5ce41c93a94f"). InnerVolumeSpecName "kube-api-access-jmgkx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.740163 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "52cd4880-5b1a-47af-9502-5ce41c93a94f" (UID: "52cd4880-5b1a-47af-9502-5ce41c93a94f"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.740823 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "52cd4880-5b1a-47af-9502-5ce41c93a94f" (UID: "52cd4880-5b1a-47af-9502-5ce41c93a94f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.812330 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jmgkx\" (UniqueName: \"kubernetes.io/projected/52cd4880-5b1a-47af-9502-5ce41c93a94f-kube-api-access-jmgkx\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.812366 5200 reconciler_common.go:299] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.812377 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:38 crc kubenswrapper[5200]: I1126 15:37:38.812386 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52cd4880-5b1a-47af-9502-5ce41c93a94f-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.095644 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-w5njm" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.095629 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-w5njm" event={"ID":"52cd4880-5b1a-47af-9502-5ce41c93a94f","Type":"ContainerDied","Data":"ffeaae942cf8e6c77d3e92a41b60b57897b1e7b1199108a4bc07873f2a1ee05b"} Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.096055 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffeaae942cf8e6c77d3e92a41b60b57897b1e7b1199108a4bc07873f2a1ee05b" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.172769 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7wkgk"] Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.174287 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52cd4880-5b1a-47af-9502-5ce41c93a94f" containerName="ssh-known-hosts-openstack" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.174311 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cd4880-5b1a-47af-9502-5ce41c93a94f" containerName="ssh-known-hosts-openstack" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.174596 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="52cd4880-5b1a-47af-9502-5ce41c93a94f" containerName="ssh-known-hosts-openstack" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.183353 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.187596 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.187969 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.188097 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.188097 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.196875 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7wkgk"] Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.325671 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2xr\" (UniqueName: \"kubernetes.io/projected/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-kube-api-access-4p2xr\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.325796 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ceph\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.326381 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-inventory\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.326491 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ssh-key\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.429443 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-inventory\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.429571 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ssh-key\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.429759 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p2xr\" (UniqueName: \"kubernetes.io/projected/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-kube-api-access-4p2xr\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.429811 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ceph\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.435357 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ssh-key\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.435500 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ceph\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.435813 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-inventory\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.448867 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p2xr\" (UniqueName: \"kubernetes.io/projected/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-kube-api-access-4p2xr\") pod \"run-os-openstack-openstack-cell1-7wkgk\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:39 crc kubenswrapper[5200]: I1126 15:37:39.508768 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:40 crc kubenswrapper[5200]: I1126 15:37:40.031009 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7wkgk"] Nov 26 15:37:40 crc kubenswrapper[5200]: I1126 15:37:40.108950 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" event={"ID":"a9cec933-2b92-4db1-a8a7-5f89ea7e303a","Type":"ContainerStarted","Data":"d99f1297a6fb203e0b96393cf1c32b74752255224c2f4ef01b44eac45b914d20"} Nov 26 15:37:41 crc kubenswrapper[5200]: I1126 15:37:41.121388 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" event={"ID":"a9cec933-2b92-4db1-a8a7-5f89ea7e303a","Type":"ContainerStarted","Data":"a997a825e41e966270ceb783e11db06e009a8a457ec6fab9cb2075732a2c9557"} Nov 26 15:37:41 crc kubenswrapper[5200]: I1126 15:37:41.146205 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" podStartSLOduration=1.345846967 podStartE2EDuration="2.146190571s" podCreationTimestamp="2025-11-26 15:37:39 +0000 UTC" firstStartedPulling="2025-11-26 15:37:40.035468517 +0000 UTC m=+7628.714670335" lastFinishedPulling="2025-11-26 15:37:40.835812121 +0000 UTC m=+7629.515013939" observedRunningTime="2025-11-26 15:37:41.14095235 +0000 UTC m=+7629.820154188" watchObservedRunningTime="2025-11-26 15:37:41.146190571 +0000 UTC m=+7629.825392389" Nov 26 15:37:42 crc kubenswrapper[5200]: E1126 15:37:42.650182 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443108 actualBytes=10240 Nov 26 15:37:49 crc kubenswrapper[5200]: I1126 15:37:49.236235 5200 generic.go:358] "Generic (PLEG): container finished" podID="a9cec933-2b92-4db1-a8a7-5f89ea7e303a" containerID="a997a825e41e966270ceb783e11db06e009a8a457ec6fab9cb2075732a2c9557" exitCode=0 Nov 26 15:37:49 crc kubenswrapper[5200]: I1126 15:37:49.236354 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" event={"ID":"a9cec933-2b92-4db1-a8a7-5f89ea7e303a","Type":"ContainerDied","Data":"a997a825e41e966270ceb783e11db06e009a8a457ec6fab9cb2075732a2c9557"} Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.869839 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.906001 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ssh-key\") pod \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.906450 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p2xr\" (UniqueName: \"kubernetes.io/projected/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-kube-api-access-4p2xr\") pod \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.906646 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-inventory\") pod \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.906917 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ceph\") pod \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\" (UID: \"a9cec933-2b92-4db1-a8a7-5f89ea7e303a\") " Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.915518 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-kube-api-access-4p2xr" (OuterVolumeSpecName: "kube-api-access-4p2xr") pod "a9cec933-2b92-4db1-a8a7-5f89ea7e303a" (UID: "a9cec933-2b92-4db1-a8a7-5f89ea7e303a"). InnerVolumeSpecName "kube-api-access-4p2xr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.927305 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ceph" (OuterVolumeSpecName: "ceph") pod "a9cec933-2b92-4db1-a8a7-5f89ea7e303a" (UID: "a9cec933-2b92-4db1-a8a7-5f89ea7e303a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.938875 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-inventory" (OuterVolumeSpecName: "inventory") pod "a9cec933-2b92-4db1-a8a7-5f89ea7e303a" (UID: "a9cec933-2b92-4db1-a8a7-5f89ea7e303a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:50 crc kubenswrapper[5200]: I1126 15:37:50.957626 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a9cec933-2b92-4db1-a8a7-5f89ea7e303a" (UID: "a9cec933-2b92-4db1-a8a7-5f89ea7e303a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.011283 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.011324 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.011335 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.011347 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4p2xr\" (UniqueName: \"kubernetes.io/projected/a9cec933-2b92-4db1-a8a7-5f89ea7e303a-kube-api-access-4p2xr\") on node \"crc\" DevicePath \"\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.256902 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.256901 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7wkgk" event={"ID":"a9cec933-2b92-4db1-a8a7-5f89ea7e303a","Type":"ContainerDied","Data":"d99f1297a6fb203e0b96393cf1c32b74752255224c2f4ef01b44eac45b914d20"} Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.257027 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d99f1297a6fb203e0b96393cf1c32b74752255224c2f4ef01b44eac45b914d20" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.335089 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2n9fd"] Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.336371 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9cec933-2b92-4db1-a8a7-5f89ea7e303a" containerName="run-os-openstack-openstack-cell1" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.336392 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cec933-2b92-4db1-a8a7-5f89ea7e303a" containerName="run-os-openstack-openstack-cell1" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.336607 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9cec933-2b92-4db1-a8a7-5f89ea7e303a" containerName="run-os-openstack-openstack-cell1" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.342425 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.346070 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.346770 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.347145 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.347655 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.348464 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2n9fd"] Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.420600 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjc4p\" (UniqueName: \"kubernetes.io/projected/4677d455-f7eb-4931-b4ea-b334c9610fac-kube-api-access-qjc4p\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.420664 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.420798 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ceph\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.420874 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-inventory\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.522403 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ceph\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.522512 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-inventory\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: 
I1126 15:37:51.522599 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjc4p\" (UniqueName: \"kubernetes.io/projected/4677d455-f7eb-4931-b4ea-b334c9610fac-kube-api-access-qjc4p\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.522636 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.540648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.540939 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-inventory\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.541569 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ceph\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.543713 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjc4p\" (UniqueName: \"kubernetes.io/projected/4677d455-f7eb-4931-b4ea-b334c9610fac-kube-api-access-qjc4p\") pod \"reboot-os-openstack-openstack-cell1-2n9fd\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:51 crc kubenswrapper[5200]: I1126 15:37:51.663649 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:37:52 crc kubenswrapper[5200]: I1126 15:37:52.221205 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2n9fd"] Nov 26 15:37:52 crc kubenswrapper[5200]: I1126 15:37:52.268008 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" event={"ID":"4677d455-f7eb-4931-b4ea-b334c9610fac","Type":"ContainerStarted","Data":"a92d0dd0b09e51d152a1b19aba3b695da1750593e4a6a0858a73b5ca35e0cbfe"} Nov 26 15:37:53 crc kubenswrapper[5200]: I1126 15:37:53.280367 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" event={"ID":"4677d455-f7eb-4931-b4ea-b334c9610fac","Type":"ContainerStarted","Data":"2660cce209097985f6c8f79410cc3239646094d428bb10ff2066331630c430f2"} Nov 26 15:37:53 crc kubenswrapper[5200]: I1126 15:37:53.298754 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" podStartSLOduration=1.8592501719999999 podStartE2EDuration="2.298735211s" podCreationTimestamp="2025-11-26 15:37:51 +0000 UTC" firstStartedPulling="2025-11-26 15:37:52.226424768 +0000 UTC m=+7640.905626586" lastFinishedPulling="2025-11-26 15:37:52.665909807 +0000 UTC m=+7641.345111625" observedRunningTime="2025-11-26 15:37:53.29685684 +0000 UTC m=+7641.976058668" watchObservedRunningTime="2025-11-26 15:37:53.298735211 +0000 UTC m=+7641.977937029" Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.154271 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.154895 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.154941 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.155622 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"905a4b74f3f4f5dcadc2b620234735b5646a4c3201da89896a20ea9b3d410457"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.155669 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://905a4b74f3f4f5dcadc2b620234735b5646a4c3201da89896a20ea9b3d410457" gracePeriod=600 Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.387784 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" 
containerID="905a4b74f3f4f5dcadc2b620234735b5646a4c3201da89896a20ea9b3d410457" exitCode=0 Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.387822 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"905a4b74f3f4f5dcadc2b620234735b5646a4c3201da89896a20ea9b3d410457"} Nov 26 15:38:03 crc kubenswrapper[5200]: I1126 15:38:03.388168 5200 scope.go:117] "RemoveContainer" containerID="b02870c1bf064cfcda6a456d963ace88dbfbf5f7a2239b21a62617db4ffc389f" Nov 26 15:38:04 crc kubenswrapper[5200]: I1126 15:38:04.400526 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02"} Nov 26 15:38:08 crc kubenswrapper[5200]: I1126 15:38:08.447035 5200 generic.go:358] "Generic (PLEG): container finished" podID="4677d455-f7eb-4931-b4ea-b334c9610fac" containerID="2660cce209097985f6c8f79410cc3239646094d428bb10ff2066331630c430f2" exitCode=0 Nov 26 15:38:08 crc kubenswrapper[5200]: I1126 15:38:08.447160 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" event={"ID":"4677d455-f7eb-4931-b4ea-b334c9610fac","Type":"ContainerDied","Data":"2660cce209097985f6c8f79410cc3239646094d428bb10ff2066331630c430f2"} Nov 26 15:38:09 crc kubenswrapper[5200]: I1126 15:38:09.947530 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.021376 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ssh-key\") pod \"4677d455-f7eb-4931-b4ea-b334c9610fac\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.049371 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4677d455-f7eb-4931-b4ea-b334c9610fac" (UID: "4677d455-f7eb-4931-b4ea-b334c9610fac"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.125989 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-inventory\") pod \"4677d455-f7eb-4931-b4ea-b334c9610fac\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.126228 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ceph\") pod \"4677d455-f7eb-4931-b4ea-b334c9610fac\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.128635 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjc4p\" (UniqueName: \"kubernetes.io/projected/4677d455-f7eb-4931-b4ea-b334c9610fac-kube-api-access-qjc4p\") pod \"4677d455-f7eb-4931-b4ea-b334c9610fac\" (UID: \"4677d455-f7eb-4931-b4ea-b334c9610fac\") " Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.129549 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ceph" (OuterVolumeSpecName: "ceph") pod "4677d455-f7eb-4931-b4ea-b334c9610fac" (UID: "4677d455-f7eb-4931-b4ea-b334c9610fac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.130880 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.130905 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.132927 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4677d455-f7eb-4931-b4ea-b334c9610fac-kube-api-access-qjc4p" (OuterVolumeSpecName: "kube-api-access-qjc4p") pod "4677d455-f7eb-4931-b4ea-b334c9610fac" (UID: "4677d455-f7eb-4931-b4ea-b334c9610fac"). InnerVolumeSpecName "kube-api-access-qjc4p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.161808 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-inventory" (OuterVolumeSpecName: "inventory") pod "4677d455-f7eb-4931-b4ea-b334c9610fac" (UID: "4677d455-f7eb-4931-b4ea-b334c9610fac"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.233404 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qjc4p\" (UniqueName: \"kubernetes.io/projected/4677d455-f7eb-4931-b4ea-b334c9610fac-kube-api-access-qjc4p\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.233667 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4677d455-f7eb-4931-b4ea-b334c9610fac-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.471082 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" event={"ID":"4677d455-f7eb-4931-b4ea-b334c9610fac","Type":"ContainerDied","Data":"a92d0dd0b09e51d152a1b19aba3b695da1750593e4a6a0858a73b5ca35e0cbfe"} Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.471482 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92d0dd0b09e51d152a1b19aba3b695da1750593e4a6a0858a73b5ca35e0cbfe" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.471453 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2n9fd" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.632292 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-vv4s4"] Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.633831 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4677d455-f7eb-4931-b4ea-b334c9610fac" containerName="reboot-os-openstack-openstack-cell1" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.633856 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4677d455-f7eb-4931-b4ea-b334c9610fac" containerName="reboot-os-openstack-openstack-cell1" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.634198 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4677d455-f7eb-4931-b4ea-b334c9610fac" containerName="reboot-os-openstack-openstack-cell1" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.642359 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.644697 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.644838 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.645374 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.646050 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.656454 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-vv4s4"] Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746021 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746086 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746114 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bgl6\" (UniqueName: \"kubernetes.io/projected/9817c26f-dc8b-449f-ab69-3a627bd97e69-kube-api-access-6bgl6\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746205 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746239 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746259 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746327 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746349 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746398 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ssh-key\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746443 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-inventory\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746510 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.746572 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ceph\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.848910 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.848964 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.848982 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849010 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849031 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849073 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ssh-key\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849108 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-inventory\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849131 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849169 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ceph\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849227 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849257 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.849280 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bgl6\" (UniqueName: \"kubernetes.io/projected/9817c26f-dc8b-449f-ab69-3a627bd97e69-kube-api-access-6bgl6\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.854726 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.854808 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.855186 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.855248 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.855719 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-inventory\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.856099 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.856237 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ssh-key\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.860829 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.861558 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.862476 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ceph\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.862781 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.868960 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bgl6\" (UniqueName: \"kubernetes.io/projected/9817c26f-dc8b-449f-ab69-3a627bd97e69-kube-api-access-6bgl6\") pod \"install-certs-openstack-openstack-cell1-vv4s4\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:10 crc kubenswrapper[5200]: I1126 15:38:10.964049 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:11 crc kubenswrapper[5200]: I1126 15:38:11.553387 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-vv4s4"] Nov 26 15:38:12 crc kubenswrapper[5200]: I1126 15:38:12.490732 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" event={"ID":"9817c26f-dc8b-449f-ab69-3a627bd97e69","Type":"ContainerStarted","Data":"e04a901994bc82a89c9e50f49d48546f798554ebf1554557eabf3bed9a924bee"} Nov 26 15:38:13 crc kubenswrapper[5200]: I1126 15:38:13.501813 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" event={"ID":"9817c26f-dc8b-449f-ab69-3a627bd97e69","Type":"ContainerStarted","Data":"7653f5adc636a1df8210aa3e59b6ee3e875c6d47463988fdc7eee34420920118"} Nov 26 15:38:13 crc kubenswrapper[5200]: I1126 15:38:13.537536 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" podStartSLOduration=2.806222101 podStartE2EDuration="3.537514723s" podCreationTimestamp="2025-11-26 15:38:10 +0000 UTC" firstStartedPulling="2025-11-26 15:38:11.568117873 +0000 UTC m=+7660.247319691" lastFinishedPulling="2025-11-26 15:38:12.299410495 +0000 UTC m=+7660.978612313" observedRunningTime="2025-11-26 15:38:13.526977681 +0000 UTC m=+7662.206179499" watchObservedRunningTime="2025-11-26 15:38:13.537514723 +0000 UTC m=+7662.216716541" Nov 26 15:38:30 crc kubenswrapper[5200]: I1126 15:38:30.682338 5200 generic.go:358] "Generic (PLEG): container finished" podID="9817c26f-dc8b-449f-ab69-3a627bd97e69" containerID="7653f5adc636a1df8210aa3e59b6ee3e875c6d47463988fdc7eee34420920118" exitCode=0 Nov 26 15:38:30 crc kubenswrapper[5200]: I1126 15:38:30.682389 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" event={"ID":"9817c26f-dc8b-449f-ab69-3a627bd97e69","Type":"ContainerDied","Data":"7653f5adc636a1df8210aa3e59b6ee3e875c6d47463988fdc7eee34420920118"} Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.230092 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250100 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-nova-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250191 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-inventory\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250253 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-metadata-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250359 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ovn-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250402 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-dhcp-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250533 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-libvirt-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250589 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ceph\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250683 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-sriov-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250817 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bgl6\" (UniqueName: \"kubernetes.io/projected/9817c26f-dc8b-449f-ab69-3a627bd97e69-kube-api-access-6bgl6\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.250855 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-telemetry-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.251057 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-bootstrap-combined-ca-bundle\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.251267 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ssh-key\") pod \"9817c26f-dc8b-449f-ab69-3a627bd97e69\" (UID: \"9817c26f-dc8b-449f-ab69-3a627bd97e69\") " Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.260420 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ceph" (OuterVolumeSpecName: "ceph") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.261277 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.261538 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.261918 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9817c26f-dc8b-449f-ab69-3a627bd97e69-kube-api-access-6bgl6" (OuterVolumeSpecName: "kube-api-access-6bgl6") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "kube-api-access-6bgl6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.262070 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.262488 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.268402 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.276632 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.281520 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.287373 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.295837 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.306722 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-inventory" (OuterVolumeSpecName: "inventory") pod "9817c26f-dc8b-449f-ab69-3a627bd97e69" (UID: "9817c26f-dc8b-449f-ab69-3a627bd97e69"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354808 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354849 5200 reconciler_common.go:299] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354863 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354876 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354888 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354899 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.354909 5200 reconciler_common.go:299] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.355073 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.355083 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.355091 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6bgl6\" (UniqueName: \"kubernetes.io/projected/9817c26f-dc8b-449f-ab69-3a627bd97e69-kube-api-access-6bgl6\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.355100 5200 reconciler_common.go:299] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.355108 5200 reconciler_common.go:299] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9817c26f-dc8b-449f-ab69-3a627bd97e69-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.713930 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" event={"ID":"9817c26f-dc8b-449f-ab69-3a627bd97e69","Type":"ContainerDied","Data":"e04a901994bc82a89c9e50f49d48546f798554ebf1554557eabf3bed9a924bee"} Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.713982 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e04a901994bc82a89c9e50f49d48546f798554ebf1554557eabf3bed9a924bee" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.714005 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vv4s4" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.813675 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-8x46w"] Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.816010 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9817c26f-dc8b-449f-ab69-3a627bd97e69" containerName="install-certs-openstack-openstack-cell1" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.816044 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9817c26f-dc8b-449f-ab69-3a627bd97e69" containerName="install-certs-openstack-openstack-cell1" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.816463 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9817c26f-dc8b-449f-ab69-3a627bd97e69" containerName="install-certs-openstack-openstack-cell1" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.827172 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.828078 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-8x46w"] Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.829496 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.829959 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.830115 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.830253 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.866735 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.866888 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ceph\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.867133 5200 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-inventory\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.867438 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzff\" (UniqueName: \"kubernetes.io/projected/23d9ad7b-8d47-49ce-90b8-698c302994f8-kube-api-access-wlzff\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.969719 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.969918 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ceph\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.970009 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-inventory\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.970165 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzff\" (UniqueName: \"kubernetes.io/projected/23d9ad7b-8d47-49ce-90b8-698c302994f8-kube-api-access-wlzff\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.976663 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.977526 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ceph\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.982962 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-inventory\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: 
\"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:32 crc kubenswrapper[5200]: I1126 15:38:32.997527 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzff\" (UniqueName: \"kubernetes.io/projected/23d9ad7b-8d47-49ce-90b8-698c302994f8-kube-api-access-wlzff\") pod \"ceph-client-openstack-openstack-cell1-8x46w\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:33 crc kubenswrapper[5200]: I1126 15:38:33.147742 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:33 crc kubenswrapper[5200]: I1126 15:38:33.683525 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-8x46w"] Nov 26 15:38:33 crc kubenswrapper[5200]: I1126 15:38:33.724061 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" event={"ID":"23d9ad7b-8d47-49ce-90b8-698c302994f8","Type":"ContainerStarted","Data":"c70041034f90cd84ec78130e625e60af25c59110885f09f55bee72fb31e515b7"} Nov 26 15:38:34 crc kubenswrapper[5200]: I1126 15:38:34.746713 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" event={"ID":"23d9ad7b-8d47-49ce-90b8-698c302994f8","Type":"ContainerStarted","Data":"c78735e94b7ea716767c8916a60b6495799c82c6e773b663cb0d9d36f62d76ae"} Nov 26 15:38:34 crc kubenswrapper[5200]: I1126 15:38:34.774172 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" podStartSLOduration=2.267247666 podStartE2EDuration="2.774156694s" podCreationTimestamp="2025-11-26 15:38:32 +0000 UTC" firstStartedPulling="2025-11-26 15:38:33.683466097 +0000 UTC m=+7682.362667915" lastFinishedPulling="2025-11-26 15:38:34.190375125 +0000 UTC m=+7682.869576943" observedRunningTime="2025-11-26 15:38:34.773327531 +0000 UTC m=+7683.452529369" watchObservedRunningTime="2025-11-26 15:38:34.774156694 +0000 UTC m=+7683.453358502" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.228420 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftdl9"] Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.238866 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.243976 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftdl9"] Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.351522 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-catalog-content\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.351873 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-utilities\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.351993 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8clwg\" (UniqueName: \"kubernetes.io/projected/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-kube-api-access-8clwg\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.454668 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-catalog-content\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.454774 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-utilities\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.454811 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8clwg\" (UniqueName: \"kubernetes.io/projected/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-kube-api-access-8clwg\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.455291 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-catalog-content\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.455430 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-utilities\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.481226 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8clwg\" (UniqueName: \"kubernetes.io/projected/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-kube-api-access-8clwg\") pod \"certified-operators-ftdl9\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:35 crc kubenswrapper[5200]: I1126 15:38:35.569945 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:36 crc kubenswrapper[5200]: I1126 15:38:36.091681 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftdl9"] Nov 26 15:38:36 crc kubenswrapper[5200]: I1126 15:38:36.766219 5200 generic.go:358] "Generic (PLEG): container finished" podID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerID="d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1" exitCode=0 Nov 26 15:38:36 crc kubenswrapper[5200]: I1126 15:38:36.766260 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerDied","Data":"d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1"} Nov 26 15:38:36 crc kubenswrapper[5200]: I1126 15:38:36.766587 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerStarted","Data":"1d652dc4100ef891471ae138cbd429890ea7aa5265605f2a64217631f438c175"} Nov 26 15:38:37 crc kubenswrapper[5200]: I1126 15:38:37.777401 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerStarted","Data":"924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2"} Nov 26 15:38:38 crc kubenswrapper[5200]: I1126 15:38:38.788810 5200 generic.go:358] "Generic (PLEG): container finished" podID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerID="924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2" exitCode=0 Nov 26 15:38:38 crc kubenswrapper[5200]: I1126 15:38:38.788864 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerDied","Data":"924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2"} Nov 26 15:38:39 crc kubenswrapper[5200]: I1126 15:38:39.801947 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerStarted","Data":"4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b"} Nov 26 15:38:39 crc kubenswrapper[5200]: I1126 15:38:39.804901 5200 generic.go:358] "Generic (PLEG): container finished" podID="23d9ad7b-8d47-49ce-90b8-698c302994f8" containerID="c78735e94b7ea716767c8916a60b6495799c82c6e773b663cb0d9d36f62d76ae" exitCode=0 Nov 26 15:38:39 crc kubenswrapper[5200]: I1126 15:38:39.804968 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" event={"ID":"23d9ad7b-8d47-49ce-90b8-698c302994f8","Type":"ContainerDied","Data":"c78735e94b7ea716767c8916a60b6495799c82c6e773b663cb0d9d36f62d76ae"} Nov 26 15:38:39 crc kubenswrapper[5200]: I1126 15:38:39.828280 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-ftdl9" podStartSLOduration=4.236900127 podStartE2EDuration="4.828257459s" podCreationTimestamp="2025-11-26 15:38:35 +0000 UTC" firstStartedPulling="2025-11-26 15:38:36.767610708 +0000 UTC m=+7685.446812526" lastFinishedPulling="2025-11-26 15:38:37.35896804 +0000 UTC m=+7686.038169858" observedRunningTime="2025-11-26 15:38:39.820194653 +0000 UTC m=+7688.499396481" watchObservedRunningTime="2025-11-26 15:38:39.828257459 +0000 UTC m=+7688.507459277" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.254169 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.286839 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlzff\" (UniqueName: \"kubernetes.io/projected/23d9ad7b-8d47-49ce-90b8-698c302994f8-kube-api-access-wlzff\") pod \"23d9ad7b-8d47-49ce-90b8-698c302994f8\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.286949 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ssh-key\") pod \"23d9ad7b-8d47-49ce-90b8-698c302994f8\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.287101 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ceph\") pod \"23d9ad7b-8d47-49ce-90b8-698c302994f8\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.287232 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-inventory\") pod \"23d9ad7b-8d47-49ce-90b8-698c302994f8\" (UID: \"23d9ad7b-8d47-49ce-90b8-698c302994f8\") " Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.296446 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ceph" (OuterVolumeSpecName: "ceph") pod "23d9ad7b-8d47-49ce-90b8-698c302994f8" (UID: "23d9ad7b-8d47-49ce-90b8-698c302994f8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.298388 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d9ad7b-8d47-49ce-90b8-698c302994f8-kube-api-access-wlzff" (OuterVolumeSpecName: "kube-api-access-wlzff") pod "23d9ad7b-8d47-49ce-90b8-698c302994f8" (UID: "23d9ad7b-8d47-49ce-90b8-698c302994f8"). InnerVolumeSpecName "kube-api-access-wlzff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.321918 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-inventory" (OuterVolumeSpecName: "inventory") pod "23d9ad7b-8d47-49ce-90b8-698c302994f8" (UID: "23d9ad7b-8d47-49ce-90b8-698c302994f8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.338866 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23d9ad7b-8d47-49ce-90b8-698c302994f8" (UID: "23d9ad7b-8d47-49ce-90b8-698c302994f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.390235 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.390274 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wlzff\" (UniqueName: \"kubernetes.io/projected/23d9ad7b-8d47-49ce-90b8-698c302994f8-kube-api-access-wlzff\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.390284 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.390293 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23d9ad7b-8d47-49ce-90b8-698c302994f8-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.825196 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.825211 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8x46w" event={"ID":"23d9ad7b-8d47-49ce-90b8-698c302994f8","Type":"ContainerDied","Data":"c70041034f90cd84ec78130e625e60af25c59110885f09f55bee72fb31e515b7"} Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.825261 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c70041034f90cd84ec78130e625e60af25c59110885f09f55bee72fb31e515b7" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.931320 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-62d9h"] Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.932521 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23d9ad7b-8d47-49ce-90b8-698c302994f8" containerName="ceph-client-openstack-openstack-cell1" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.932541 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d9ad7b-8d47-49ce-90b8-698c302994f8" containerName="ceph-client-openstack-openstack-cell1" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.932836 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="23d9ad7b-8d47-49ce-90b8-698c302994f8" containerName="ceph-client-openstack-openstack-cell1" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.941291 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.944327 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.944333 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.944410 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.945019 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-config\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.945321 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:38:41 crc kubenswrapper[5200]: I1126 15:38:41.952887 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-62d9h"] Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.004787 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9c463695-8a19-4649-bbc3-85c2c13225fe-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.004848 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ssh-key\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.004949 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mc5l\" (UniqueName: \"kubernetes.io/projected/9c463695-8a19-4649-bbc3-85c2c13225fe-kube-api-access-6mc5l\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.005108 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.005411 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-inventory\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.005549 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ceph\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.108736 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ceph\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.109347 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9c463695-8a19-4649-bbc3-85c2c13225fe-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.109377 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ssh-key\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.109434 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mc5l\" (UniqueName: \"kubernetes.io/projected/9c463695-8a19-4649-bbc3-85c2c13225fe-kube-api-access-6mc5l\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.109534 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.109608 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-inventory\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.110986 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9c463695-8a19-4649-bbc3-85c2c13225fe-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.116423 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ssh-key\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.116446 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ceph\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.116539 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-inventory\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.125427 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.128288 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mc5l\" (UniqueName: \"kubernetes.io/projected/9c463695-8a19-4649-bbc3-85c2c13225fe-kube-api-access-6mc5l\") pod \"ovn-openstack-openstack-cell1-62d9h\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.266592 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:38:42 crc kubenswrapper[5200]: E1126 15:38:42.879669 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2444481 actualBytes=10240 Nov 26 15:38:42 crc kubenswrapper[5200]: W1126 15:38:42.891006 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c463695_8a19_4649_bbc3_85c2c13225fe.slice/crio-d5d39aa601c78739dd10ef2581b728a39481d4bf825124ae91c561797858c6d0 WatchSource:0}: Error finding container d5d39aa601c78739dd10ef2581b728a39481d4bf825124ae91c561797858c6d0: Status 404 returned error can't find the container with id d5d39aa601c78739dd10ef2581b728a39481d4bf825124ae91c561797858c6d0 Nov 26 15:38:42 crc kubenswrapper[5200]: I1126 15:38:42.898032 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-62d9h"] Nov 26 15:38:43 crc kubenswrapper[5200]: I1126 15:38:43.846183 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-62d9h" event={"ID":"9c463695-8a19-4649-bbc3-85c2c13225fe","Type":"ContainerStarted","Data":"fddb135d985f467d339798d7ac4f066129efa04803d8782ed9d234ef10d9cc45"} Nov 26 15:38:43 crc kubenswrapper[5200]: I1126 15:38:43.846832 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-62d9h" event={"ID":"9c463695-8a19-4649-bbc3-85c2c13225fe","Type":"ContainerStarted","Data":"d5d39aa601c78739dd10ef2581b728a39481d4bf825124ae91c561797858c6d0"} Nov 26 15:38:43 crc kubenswrapper[5200]: I1126 15:38:43.869557 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-62d9h" podStartSLOduration=2.380926679 podStartE2EDuration="2.869532456s" podCreationTimestamp="2025-11-26 15:38:41 +0000 UTC" firstStartedPulling="2025-11-26 15:38:42.892785234 +0000 UTC 
m=+7691.571987052" lastFinishedPulling="2025-11-26 15:38:43.381391011 +0000 UTC m=+7692.060592829" observedRunningTime="2025-11-26 15:38:43.86743811 +0000 UTC m=+7692.546639948" watchObservedRunningTime="2025-11-26 15:38:43.869532456 +0000 UTC m=+7692.548734274" Nov 26 15:38:45 crc kubenswrapper[5200]: I1126 15:38:45.570307 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:45 crc kubenswrapper[5200]: I1126 15:38:45.571069 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:45 crc kubenswrapper[5200]: I1126 15:38:45.615865 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:45 crc kubenswrapper[5200]: I1126 15:38:45.916172 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:45 crc kubenswrapper[5200]: I1126 15:38:45.962998 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftdl9"] Nov 26 15:38:47 crc kubenswrapper[5200]: I1126 15:38:47.886423 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftdl9" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="registry-server" containerID="cri-o://4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b" gracePeriod=2 Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.380595 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.568582 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8clwg\" (UniqueName: \"kubernetes.io/projected/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-kube-api-access-8clwg\") pod \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.568875 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-utilities\") pod \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.569279 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-catalog-content\") pod \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\" (UID: \"3579016f-2b4e-43f7-a38d-f39ecfc7ec17\") " Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.569907 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-utilities" (OuterVolumeSpecName: "utilities") pod "3579016f-2b4e-43f7-a38d-f39ecfc7ec17" (UID: "3579016f-2b4e-43f7-a38d-f39ecfc7ec17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.579193 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-kube-api-access-8clwg" (OuterVolumeSpecName: "kube-api-access-8clwg") pod "3579016f-2b4e-43f7-a38d-f39ecfc7ec17" (UID: "3579016f-2b4e-43f7-a38d-f39ecfc7ec17"). InnerVolumeSpecName "kube-api-access-8clwg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.599591 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3579016f-2b4e-43f7-a38d-f39ecfc7ec17" (UID: "3579016f-2b4e-43f7-a38d-f39ecfc7ec17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.672565 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.672861 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8clwg\" (UniqueName: \"kubernetes.io/projected/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-kube-api-access-8clwg\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.672969 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3579016f-2b4e-43f7-a38d-f39ecfc7ec17-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.897530 5200 generic.go:358] "Generic (PLEG): container finished" podID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerID="4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b" exitCode=0 Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.897607 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerDied","Data":"4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b"} Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.897639 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftdl9" event={"ID":"3579016f-2b4e-43f7-a38d-f39ecfc7ec17","Type":"ContainerDied","Data":"1d652dc4100ef891471ae138cbd429890ea7aa5265605f2a64217631f438c175"} Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.897700 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftdl9" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.897667 5200 scope.go:117] "RemoveContainer" containerID="4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.925660 5200 scope.go:117] "RemoveContainer" containerID="924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.944800 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftdl9"] Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.954944 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftdl9"] Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.966523 5200 scope.go:117] "RemoveContainer" containerID="d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1" Nov 26 15:38:48 crc kubenswrapper[5200]: I1126 15:38:48.982029 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" path="/var/lib/kubelet/pods/3579016f-2b4e-43f7-a38d-f39ecfc7ec17/volumes" Nov 26 15:38:49 crc kubenswrapper[5200]: I1126 15:38:49.006421 5200 scope.go:117] "RemoveContainer" containerID="4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b" Nov 26 15:38:49 crc kubenswrapper[5200]: E1126 15:38:49.007023 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b\": container with ID starting with 4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b not found: ID does not exist" containerID="4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b" Nov 26 15:38:49 crc kubenswrapper[5200]: I1126 15:38:49.007157 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b"} err="failed to get container status \"4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b\": rpc error: code = NotFound desc = could not find container \"4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b\": container with ID starting with 4d13e87ae7270bac64d628e791a2801c6a558794f3c346d769064932741cc13b not found: ID does not exist" Nov 26 15:38:49 crc kubenswrapper[5200]: I1126 15:38:49.007263 5200 scope.go:117] "RemoveContainer" containerID="924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2" Nov 26 15:38:49 crc kubenswrapper[5200]: E1126 15:38:49.007874 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2\": container with ID starting with 924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2 not found: ID does not exist" containerID="924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2" Nov 26 15:38:49 crc kubenswrapper[5200]: I1126 15:38:49.007912 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2"} err="failed to get container status \"924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2\": rpc error: code = NotFound desc = could not find container 
\"924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2\": container with ID starting with 924e5dcf25fe6c1d4f6f0d605f5599c6aae0fd7031f9466b04299024dcd084f2 not found: ID does not exist" Nov 26 15:38:49 crc kubenswrapper[5200]: I1126 15:38:49.007935 5200 scope.go:117] "RemoveContainer" containerID="d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1" Nov 26 15:38:49 crc kubenswrapper[5200]: E1126 15:38:49.008263 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1\": container with ID starting with d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1 not found: ID does not exist" containerID="d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1" Nov 26 15:38:49 crc kubenswrapper[5200]: I1126 15:38:49.008323 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1"} err="failed to get container status \"d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1\": rpc error: code = NotFound desc = could not find container \"d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1\": container with ID starting with d07f61f16269db1138576495e003142295b9084cec512883d22dc47490c25ee1 not found: ID does not exist" Nov 26 15:39:42 crc kubenswrapper[5200]: E1126 15:39:42.452575 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443080 actualBytes=10240 Nov 26 15:39:48 crc kubenswrapper[5200]: I1126 15:39:48.508278 5200 generic.go:358] "Generic (PLEG): container finished" podID="9c463695-8a19-4649-bbc3-85c2c13225fe" containerID="fddb135d985f467d339798d7ac4f066129efa04803d8782ed9d234ef10d9cc45" exitCode=0 Nov 26 15:39:48 crc kubenswrapper[5200]: I1126 15:39:48.508369 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-62d9h" event={"ID":"9c463695-8a19-4649-bbc3-85c2c13225fe","Type":"ContainerDied","Data":"fddb135d985f467d339798d7ac4f066129efa04803d8782ed9d234ef10d9cc45"} Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.026623 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.079093 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ssh-key\") pod \"9c463695-8a19-4649-bbc3-85c2c13225fe\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.079255 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9c463695-8a19-4649-bbc3-85c2c13225fe-ovncontroller-config-0\") pod \"9c463695-8a19-4649-bbc3-85c2c13225fe\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.079429 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ceph\") pod \"9c463695-8a19-4649-bbc3-85c2c13225fe\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.079585 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ovn-combined-ca-bundle\") pod \"9c463695-8a19-4649-bbc3-85c2c13225fe\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.080012 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-inventory\") pod \"9c463695-8a19-4649-bbc3-85c2c13225fe\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.080087 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mc5l\" (UniqueName: \"kubernetes.io/projected/9c463695-8a19-4649-bbc3-85c2c13225fe-kube-api-access-6mc5l\") pod \"9c463695-8a19-4649-bbc3-85c2c13225fe\" (UID: \"9c463695-8a19-4649-bbc3-85c2c13225fe\") " Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.086671 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c463695-8a19-4649-bbc3-85c2c13225fe-kube-api-access-6mc5l" (OuterVolumeSpecName: "kube-api-access-6mc5l") pod "9c463695-8a19-4649-bbc3-85c2c13225fe" (UID: "9c463695-8a19-4649-bbc3-85c2c13225fe"). InnerVolumeSpecName "kube-api-access-6mc5l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.086990 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ceph" (OuterVolumeSpecName: "ceph") pod "9c463695-8a19-4649-bbc3-85c2c13225fe" (UID: "9c463695-8a19-4649-bbc3-85c2c13225fe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.087335 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9c463695-8a19-4649-bbc3-85c2c13225fe" (UID: "9c463695-8a19-4649-bbc3-85c2c13225fe"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.112811 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-inventory" (OuterVolumeSpecName: "inventory") pod "9c463695-8a19-4649-bbc3-85c2c13225fe" (UID: "9c463695-8a19-4649-bbc3-85c2c13225fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.113152 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c463695-8a19-4649-bbc3-85c2c13225fe-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9c463695-8a19-4649-bbc3-85c2c13225fe" (UID: "9c463695-8a19-4649-bbc3-85c2c13225fe"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.125639 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c463695-8a19-4649-bbc3-85c2c13225fe" (UID: "9c463695-8a19-4649-bbc3-85c2c13225fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.182703 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.182738 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6mc5l\" (UniqueName: \"kubernetes.io/projected/9c463695-8a19-4649-bbc3-85c2c13225fe-kube-api-access-6mc5l\") on node \"crc\" DevicePath \"\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.182751 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.182761 5200 reconciler_common.go:299] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9c463695-8a19-4649-bbc3-85c2c13225fe-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.182771 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.182782 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c463695-8a19-4649-bbc3-85c2c13225fe-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.549565 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-62d9h" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.549626 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-62d9h" event={"ID":"9c463695-8a19-4649-bbc3-85c2c13225fe","Type":"ContainerDied","Data":"d5d39aa601c78739dd10ef2581b728a39481d4bf825124ae91c561797858c6d0"} Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.549663 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d39aa601c78739dd10ef2581b728a39481d4bf825124ae91c561797858c6d0" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.617396 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-r7nwj"] Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619278 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="extract-content" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619314 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="extract-content" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619328 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="registry-server" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619336 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="registry-server" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619352 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="extract-utilities" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619359 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="extract-utilities" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619440 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c463695-8a19-4649-bbc3-85c2c13225fe" containerName="ovn-openstack-openstack-cell1" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619451 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c463695-8a19-4649-bbc3-85c2c13225fe" containerName="ovn-openstack-openstack-cell1" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619738 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3579016f-2b4e-43f7-a38d-f39ecfc7ec17" containerName="registry-server" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.619780 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c463695-8a19-4649-bbc3-85c2c13225fe" containerName="ovn-openstack-openstack-cell1" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.627204 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.629786 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.629981 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.630356 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-neutron-config\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.630659 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.632156 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.632412 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-ovn-metadata-agent-neutron-config\"" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.634497 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-r7nwj"] Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.695451 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84f26\" (UniqueName: \"kubernetes.io/projected/56520482-0e09-42f1-be88-5e063facd0d1-kube-api-access-84f26\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.695502 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.695549 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.695699 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.695905 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: 
\"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.696293 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.696456 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799039 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799102 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799180 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84f26\" (UniqueName: \"kubernetes.io/projected/56520482-0e09-42f1-be88-5e063facd0d1-kube-api-access-84f26\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799210 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799252 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799291 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.799333 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.805440 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.806382 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.806841 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.806946 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.807563 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.812187 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.819111 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84f26\" (UniqueName: 
\"kubernetes.io/projected/56520482-0e09-42f1-be88-5e063facd0d1-kube-api-access-84f26\") pod \"neutron-metadata-openstack-openstack-cell1-r7nwj\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:50 crc kubenswrapper[5200]: I1126 15:39:50.964071 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:39:51 crc kubenswrapper[5200]: I1126 15:39:51.488997 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-r7nwj"] Nov 26 15:39:51 crc kubenswrapper[5200]: I1126 15:39:51.561494 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" event={"ID":"56520482-0e09-42f1-be88-5e063facd0d1","Type":"ContainerStarted","Data":"9b6e25335f18d12753dc58474576294999bc58da5b9aece7a4e51c6afd441308"} Nov 26 15:39:52 crc kubenswrapper[5200]: I1126 15:39:52.580719 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" event={"ID":"56520482-0e09-42f1-be88-5e063facd0d1","Type":"ContainerStarted","Data":"4608ed70b0cf2de8d6ff692bf015c45bd890f2b2d9d13f16ea26a1fcd5205673"} Nov 26 15:39:52 crc kubenswrapper[5200]: I1126 15:39:52.615650 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" podStartSLOduration=1.9198125510000001 podStartE2EDuration="2.615630183s" podCreationTimestamp="2025-11-26 15:39:50 +0000 UTC" firstStartedPulling="2025-11-26 15:39:51.491059148 +0000 UTC m=+7760.170260966" lastFinishedPulling="2025-11-26 15:39:52.18687678 +0000 UTC m=+7760.866078598" observedRunningTime="2025-11-26 15:39:52.606427366 +0000 UTC m=+7761.285629204" watchObservedRunningTime="2025-11-26 15:39:52.615630183 +0000 UTC m=+7761.294832001" Nov 26 15:40:03 crc kubenswrapper[5200]: I1126 15:40:03.155113 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:40:03 crc kubenswrapper[5200]: I1126 15:40:03.156040 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:40:33 crc kubenswrapper[5200]: I1126 15:40:33.154898 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:40:33 crc kubenswrapper[5200]: I1126 15:40:33.155488 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:40:38 crc kubenswrapper[5200]: I1126 15:40:38.525014 5200 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:40:38 crc kubenswrapper[5200]: I1126 15:40:38.530389 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:40:38 crc kubenswrapper[5200]: I1126 15:40:38.532459 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:40:38 crc kubenswrapper[5200]: I1126 15:40:38.537056 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:40:42 crc kubenswrapper[5200]: E1126 15:40:42.561778 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441886 actualBytes=10240 Nov 26 15:40:47 crc kubenswrapper[5200]: I1126 15:40:47.164992 5200 generic.go:358] "Generic (PLEG): container finished" podID="56520482-0e09-42f1-be88-5e063facd0d1" containerID="4608ed70b0cf2de8d6ff692bf015c45bd890f2b2d9d13f16ea26a1fcd5205673" exitCode=0 Nov 26 15:40:47 crc kubenswrapper[5200]: I1126 15:40:47.165049 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" event={"ID":"56520482-0e09-42f1-be88-5e063facd0d1","Type":"ContainerDied","Data":"4608ed70b0cf2de8d6ff692bf015c45bd890f2b2d9d13f16ea26a1fcd5205673"} Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.638587 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.797794 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ssh-key\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.797906 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.798001 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ceph\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.798185 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-nova-metadata-neutron-config-0\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.798272 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-metadata-combined-ca-bundle\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.798440 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84f26\" (UniqueName: \"kubernetes.io/projected/56520482-0e09-42f1-be88-5e063facd0d1-kube-api-access-84f26\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.798518 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-inventory\") pod \"56520482-0e09-42f1-be88-5e063facd0d1\" (UID: \"56520482-0e09-42f1-be88-5e063facd0d1\") " Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.804335 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.804374 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ceph" (OuterVolumeSpecName: "ceph") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.805415 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56520482-0e09-42f1-be88-5e063facd0d1-kube-api-access-84f26" (OuterVolumeSpecName: "kube-api-access-84f26") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "kube-api-access-84f26". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.828072 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.828865 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.843747 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.845468 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-inventory" (OuterVolumeSpecName: "inventory") pod "56520482-0e09-42f1-be88-5e063facd0d1" (UID: "56520482-0e09-42f1-be88-5e063facd0d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901199 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901229 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901240 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901250 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901260 5200 reconciler_common.go:299] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901270 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56520482-0e09-42f1-be88-5e063facd0d1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:48 crc kubenswrapper[5200]: I1126 15:40:48.901279 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84f26\" (UniqueName: \"kubernetes.io/projected/56520482-0e09-42f1-be88-5e063facd0d1-kube-api-access-84f26\") on node \"crc\" DevicePath \"\"" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.187917 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.187919 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-r7nwj" event={"ID":"56520482-0e09-42f1-be88-5e063facd0d1","Type":"ContainerDied","Data":"9b6e25335f18d12753dc58474576294999bc58da5b9aece7a4e51c6afd441308"} Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.188034 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b6e25335f18d12753dc58474576294999bc58da5b9aece7a4e51c6afd441308" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.270393 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-m9k4v"] Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.271853 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56520482-0e09-42f1-be88-5e063facd0d1" containerName="neutron-metadata-openstack-openstack-cell1" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.271884 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="56520482-0e09-42f1-be88-5e063facd0d1" containerName="neutron-metadata-openstack-openstack-cell1" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.272094 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="56520482-0e09-42f1-be88-5e063facd0d1" containerName="neutron-metadata-openstack-openstack-cell1" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.277895 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.279633 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-m9k4v"] Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.282365 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"libvirt-secret\"" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.282552 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.282707 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.282824 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.282968 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.420151 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.420489 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ftgf\" (UniqueName: \"kubernetes.io/projected/b36710e1-d61f-4fcc-9c44-d1298dedac90-kube-api-access-2ftgf\") 
pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.420518 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.420809 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ssh-key\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.420919 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ceph\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.421005 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-inventory\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.522777 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ceph\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.522861 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-inventory\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.523886 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.524187 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ftgf\" (UniqueName: \"kubernetes.io/projected/b36710e1-d61f-4fcc-9c44-d1298dedac90-kube-api-access-2ftgf\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.524233 5200 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.524334 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ssh-key\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.528328 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-inventory\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.528322 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.528798 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ceph\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.530355 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ssh-key\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.531016 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.541673 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ftgf\" (UniqueName: \"kubernetes.io/projected/b36710e1-d61f-4fcc-9c44-d1298dedac90-kube-api-access-2ftgf\") pod \"libvirt-openstack-openstack-cell1-m9k4v\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:49 crc kubenswrapper[5200]: I1126 15:40:49.632915 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:40:50 crc kubenswrapper[5200]: I1126 15:40:50.136801 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-m9k4v"] Nov 26 15:40:50 crc kubenswrapper[5200]: I1126 15:40:50.198239 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" event={"ID":"b36710e1-d61f-4fcc-9c44-d1298dedac90","Type":"ContainerStarted","Data":"410a82a7082c0df4e99531066e29554913eb2bd16d7868ee179bcfb92b2a92af"} Nov 26 15:40:51 crc kubenswrapper[5200]: I1126 15:40:51.217740 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" event={"ID":"b36710e1-d61f-4fcc-9c44-d1298dedac90","Type":"ContainerStarted","Data":"1657627c9a0ec245b637864098ce3d223488d0665ad04a9190f8f1b138ab4502"} Nov 26 15:40:51 crc kubenswrapper[5200]: I1126 15:40:51.248442 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" podStartSLOduration=1.724750612 podStartE2EDuration="2.248420252s" podCreationTimestamp="2025-11-26 15:40:49 +0000 UTC" firstStartedPulling="2025-11-26 15:40:50.145031552 +0000 UTC m=+7818.824233370" lastFinishedPulling="2025-11-26 15:40:50.668701192 +0000 UTC m=+7819.347903010" observedRunningTime="2025-11-26 15:40:51.234265081 +0000 UTC m=+7819.913466919" watchObservedRunningTime="2025-11-26 15:40:51.248420252 +0000 UTC m=+7819.927622080" Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.154878 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.155548 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.155610 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.158181 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.158250 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" gracePeriod=600 Nov 26 15:41:03 crc kubenswrapper[5200]: E1126 15:41:03.282308 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.347713 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" exitCode=0 Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.347794 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02"} Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.347847 5200 scope.go:117] "RemoveContainer" containerID="905a4b74f3f4f5dcadc2b620234735b5646a4c3201da89896a20ea9b3d410457" Nov 26 15:41:03 crc kubenswrapper[5200]: I1126 15:41:03.348615 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:41:03 crc kubenswrapper[5200]: E1126 15:41:03.349049 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:41:15 crc kubenswrapper[5200]: I1126 15:41:15.969403 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:41:15 crc kubenswrapper[5200]: E1126 15:41:15.970343 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:41:30 crc kubenswrapper[5200]: I1126 15:41:30.981479 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:41:30 crc kubenswrapper[5200]: E1126 15:41:30.982364 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:41:41 crc kubenswrapper[5200]: I1126 15:41:41.970118 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:41:41 crc kubenswrapper[5200]: E1126 15:41:41.970925 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:41:42 crc kubenswrapper[5200]: E1126 15:41:42.347555 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441783 actualBytes=10240 Nov 26 15:41:55 crc kubenswrapper[5200]: I1126 15:41:55.970277 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:41:55 crc kubenswrapper[5200]: E1126 15:41:55.971451 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.200964 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-76zp7"] Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.212291 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.223805 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76zp7"] Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.336022 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-utilities\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.336134 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-catalog-content\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.336474 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxc7g\" (UniqueName: \"kubernetes.io/projected/3268a317-1d01-4d53-a1ab-5f6d75932369-kube-api-access-bxc7g\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.438784 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxc7g\" (UniqueName: \"kubernetes.io/projected/3268a317-1d01-4d53-a1ab-5f6d75932369-kube-api-access-bxc7g\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.438994 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-utilities\") pod 
\"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.439063 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-catalog-content\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.439593 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-utilities\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.439624 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-catalog-content\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.460496 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxc7g\" (UniqueName: \"kubernetes.io/projected/3268a317-1d01-4d53-a1ab-5f6d75932369-kube-api-access-bxc7g\") pod \"community-operators-76zp7\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:01 crc kubenswrapper[5200]: I1126 15:42:01.538457 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:02 crc kubenswrapper[5200]: I1126 15:42:02.072299 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76zp7"] Nov 26 15:42:02 crc kubenswrapper[5200]: I1126 15:42:02.110566 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerStarted","Data":"6760a79dbeb57c9c49cce607a53e18834a401699828c4412ce9d037441e35d27"} Nov 26 15:42:03 crc kubenswrapper[5200]: I1126 15:42:03.125567 5200 generic.go:358] "Generic (PLEG): container finished" podID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerID="18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15" exitCode=0 Nov 26 15:42:03 crc kubenswrapper[5200]: I1126 15:42:03.125723 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerDied","Data":"18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15"} Nov 26 15:42:04 crc kubenswrapper[5200]: I1126 15:42:04.139322 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerStarted","Data":"3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a"} Nov 26 15:42:05 crc kubenswrapper[5200]: I1126 15:42:05.150807 5200 generic.go:358] "Generic (PLEG): container finished" podID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerID="3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a" exitCode=0 Nov 26 15:42:05 crc kubenswrapper[5200]: I1126 15:42:05.150906 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerDied","Data":"3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a"} Nov 26 15:42:06 crc kubenswrapper[5200]: I1126 15:42:06.163370 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerStarted","Data":"8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072"} Nov 26 15:42:06 crc kubenswrapper[5200]: I1126 15:42:06.185752 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-76zp7" podStartSLOduration=4.415967386 podStartE2EDuration="5.185733362s" podCreationTimestamp="2025-11-26 15:42:01 +0000 UTC" firstStartedPulling="2025-11-26 15:42:03.127902607 +0000 UTC m=+7891.807104425" lastFinishedPulling="2025-11-26 15:42:03.897668583 +0000 UTC m=+7892.576870401" observedRunningTime="2025-11-26 15:42:06.181545989 +0000 UTC m=+7894.860747827" watchObservedRunningTime="2025-11-26 15:42:06.185733362 +0000 UTC m=+7894.864935180" Nov 26 15:42:06 crc kubenswrapper[5200]: I1126 15:42:06.969555 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:42:06 crc kubenswrapper[5200]: E1126 15:42:06.969994 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:42:11 crc kubenswrapper[5200]: I1126 15:42:11.538638 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:11 crc kubenswrapper[5200]: I1126 15:42:11.539273 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:11 crc kubenswrapper[5200]: I1126 15:42:11.586435 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:12 crc kubenswrapper[5200]: I1126 15:42:12.274582 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:12 crc kubenswrapper[5200]: I1126 15:42:12.329186 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76zp7"] Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.243645 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-76zp7" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="registry-server" containerID="cri-o://8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072" gracePeriod=2 Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.756134 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.854299 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-utilities\") pod \"3268a317-1d01-4d53-a1ab-5f6d75932369\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.854670 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxc7g\" (UniqueName: \"kubernetes.io/projected/3268a317-1d01-4d53-a1ab-5f6d75932369-kube-api-access-bxc7g\") pod \"3268a317-1d01-4d53-a1ab-5f6d75932369\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.855226 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-catalog-content\") pod \"3268a317-1d01-4d53-a1ab-5f6d75932369\" (UID: \"3268a317-1d01-4d53-a1ab-5f6d75932369\") " Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.855592 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-utilities" (OuterVolumeSpecName: "utilities") pod "3268a317-1d01-4d53-a1ab-5f6d75932369" (UID: "3268a317-1d01-4d53-a1ab-5f6d75932369"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.859146 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.868790 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3268a317-1d01-4d53-a1ab-5f6d75932369-kube-api-access-bxc7g" (OuterVolumeSpecName: "kube-api-access-bxc7g") pod "3268a317-1d01-4d53-a1ab-5f6d75932369" (UID: "3268a317-1d01-4d53-a1ab-5f6d75932369"). InnerVolumeSpecName "kube-api-access-bxc7g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:42:14 crc kubenswrapper[5200]: I1126 15:42:14.961116 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bxc7g\" (UniqueName: \"kubernetes.io/projected/3268a317-1d01-4d53-a1ab-5f6d75932369-kube-api-access-bxc7g\") on node \"crc\" DevicePath \"\"" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.256817 5200 generic.go:358] "Generic (PLEG): container finished" podID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerID="8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072" exitCode=0 Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.256885 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76zp7" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.256968 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerDied","Data":"8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072"} Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.257056 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76zp7" event={"ID":"3268a317-1d01-4d53-a1ab-5f6d75932369","Type":"ContainerDied","Data":"6760a79dbeb57c9c49cce607a53e18834a401699828c4412ce9d037441e35d27"} Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.257092 5200 scope.go:117] "RemoveContainer" containerID="8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.284790 5200 scope.go:117] "RemoveContainer" containerID="3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.297475 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3268a317-1d01-4d53-a1ab-5f6d75932369" (UID: "3268a317-1d01-4d53-a1ab-5f6d75932369"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.308642 5200 scope.go:117] "RemoveContainer" containerID="18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.357698 5200 scope.go:117] "RemoveContainer" containerID="8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072" Nov 26 15:42:15 crc kubenswrapper[5200]: E1126 15:42:15.358332 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072\": container with ID starting with 8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072 not found: ID does not exist" containerID="8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.358515 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072"} err="failed to get container status \"8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072\": rpc error: code = NotFound desc = could not find container \"8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072\": container with ID starting with 8133f44f7838f08e9bad2efdcf0fbfb4a532f8cabdacd9534083fcf58424d072 not found: ID does not exist" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.358657 5200 scope.go:117] "RemoveContainer" containerID="3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a" Nov 26 15:42:15 crc kubenswrapper[5200]: E1126 15:42:15.359549 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a\": container with ID starting with 3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a not found: ID does not exist" containerID="3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.359600 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a"} err="failed to get container status \"3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a\": rpc error: code = NotFound desc = could not find container \"3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a\": container with ID starting with 3e885e940e1612af285da3b92674fba314f3e6f34171366a74c995beaccdbd6a not found: ID does not exist" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.359638 5200 scope.go:117] "RemoveContainer" containerID="18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15" Nov 26 15:42:15 crc kubenswrapper[5200]: E1126 15:42:15.360019 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15\": container with ID starting with 18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15 not found: ID does not exist" containerID="18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.360058 5200 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15"} err="failed to get container status \"18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15\": rpc error: code = NotFound desc = could not find container \"18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15\": container with ID starting with 18c98e3c285c1e62d03537956641066d19501e3e6ab875feb530ef682944af15 not found: ID does not exist" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.374056 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3268a317-1d01-4d53-a1ab-5f6d75932369-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.603808 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76zp7"] Nov 26 15:42:15 crc kubenswrapper[5200]: I1126 15:42:15.618595 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-76zp7"] Nov 26 15:42:16 crc kubenswrapper[5200]: I1126 15:42:16.982878 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" path="/var/lib/kubelet/pods/3268a317-1d01-4d53-a1ab-5f6d75932369/volumes" Nov 26 15:42:21 crc kubenswrapper[5200]: I1126 15:42:21.969489 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:42:21 crc kubenswrapper[5200]: E1126 15:42:21.970837 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:42:35 crc kubenswrapper[5200]: I1126 15:42:35.970800 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:42:35 crc kubenswrapper[5200]: E1126 15:42:35.971738 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:42:42 crc kubenswrapper[5200]: E1126 15:42:42.552496 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441787 actualBytes=10240 Nov 26 15:42:50 crc kubenswrapper[5200]: I1126 15:42:50.969665 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:42:50 crc kubenswrapper[5200]: E1126 15:42:50.970625 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:43:01 crc 
kubenswrapper[5200]: I1126 15:43:01.970952 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:43:01 crc kubenswrapper[5200]: E1126 15:43:01.971847 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:43:13 crc kubenswrapper[5200]: I1126 15:43:13.969775 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:43:13 crc kubenswrapper[5200]: E1126 15:43:13.970952 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:43:24 crc kubenswrapper[5200]: I1126 15:43:24.970982 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:43:24 crc kubenswrapper[5200]: E1126 15:43:24.971815 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:43:36 crc kubenswrapper[5200]: I1126 15:43:36.969401 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:43:36 crc kubenswrapper[5200]: E1126 15:43:36.970377 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:43:42 crc kubenswrapper[5200]: E1126 15:43:42.475544 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441786 actualBytes=10240 Nov 26 15:43:49 crc kubenswrapper[5200]: I1126 15:43:49.969532 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:43:49 crc kubenswrapper[5200]: E1126 15:43:49.970585 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:44:00 crc 
kubenswrapper[5200]: I1126 15:44:00.970241 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:44:00 crc kubenswrapper[5200]: E1126 15:44:00.971544 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:44:14 crc kubenswrapper[5200]: I1126 15:44:14.970174 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:44:14 crc kubenswrapper[5200]: E1126 15:44:14.971231 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:44:29 crc kubenswrapper[5200]: I1126 15:44:29.969890 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:44:29 crc kubenswrapper[5200]: E1126 15:44:29.970715 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:44:42 crc kubenswrapper[5200]: E1126 15:44:42.604981 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441785 actualBytes=10240 Nov 26 15:44:43 crc kubenswrapper[5200]: I1126 15:44:43.970030 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:44:43 crc kubenswrapper[5200]: E1126 15:44:43.970725 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:44:56 crc kubenswrapper[5200]: I1126 15:44:56.970333 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:44:56 crc kubenswrapper[5200]: E1126 15:44:56.972254 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:45:00 crc 
kubenswrapper[5200]: I1126 15:45:00.171081 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj"] Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173471 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="extract-utilities" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173503 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="extract-utilities" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173553 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="registry-server" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173559 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="registry-server" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173571 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="extract-content" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173577 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="extract-content" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.173852 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="3268a317-1d01-4d53-a1ab-5f6d75932369" containerName="registry-server" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.182928 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj"] Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.183086 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.186622 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.186945 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.264395 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/942bddb0-ba7c-414c-9601-4455784b5099-secret-volume\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.264473 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxcj\" (UniqueName: \"kubernetes.io/projected/942bddb0-ba7c-414c-9601-4455784b5099-kube-api-access-scxcj\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.265017 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/942bddb0-ba7c-414c-9601-4455784b5099-config-volume\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.368596 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/942bddb0-ba7c-414c-9601-4455784b5099-config-volume\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.368708 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/942bddb0-ba7c-414c-9601-4455784b5099-secret-volume\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.368763 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scxcj\" (UniqueName: \"kubernetes.io/projected/942bddb0-ba7c-414c-9601-4455784b5099-kube-api-access-scxcj\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.369753 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/942bddb0-ba7c-414c-9601-4455784b5099-config-volume\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 
26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.383129 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/942bddb0-ba7c-414c-9601-4455784b5099-secret-volume\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.386497 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxcj\" (UniqueName: \"kubernetes.io/projected/942bddb0-ba7c-414c-9601-4455784b5099-kube-api-access-scxcj\") pod \"collect-profiles-29402865-bd8xj\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.509010 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.971411 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:45:00 crc kubenswrapper[5200]: I1126 15:45:00.982344 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj"] Nov 26 15:45:01 crc kubenswrapper[5200]: I1126 15:45:01.064496 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" event={"ID":"942bddb0-ba7c-414c-9601-4455784b5099","Type":"ContainerStarted","Data":"ffeaf896ab13da228c718e6064aad55535620b8c707f3c81812ed0cab1dc536c"} Nov 26 15:45:02 crc kubenswrapper[5200]: I1126 15:45:02.076449 5200 generic.go:358] "Generic (PLEG): container finished" podID="942bddb0-ba7c-414c-9601-4455784b5099" containerID="48d4192c834a81851c31c3eca90092e838187bcc3980c1a1a6f6c9939a623809" exitCode=0 Nov 26 15:45:02 crc kubenswrapper[5200]: I1126 15:45:02.076628 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" event={"ID":"942bddb0-ba7c-414c-9601-4455784b5099","Type":"ContainerDied","Data":"48d4192c834a81851c31c3eca90092e838187bcc3980c1a1a6f6c9939a623809"} Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.501741 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.641337 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/942bddb0-ba7c-414c-9601-4455784b5099-config-volume\") pod \"942bddb0-ba7c-414c-9601-4455784b5099\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.641480 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scxcj\" (UniqueName: \"kubernetes.io/projected/942bddb0-ba7c-414c-9601-4455784b5099-kube-api-access-scxcj\") pod \"942bddb0-ba7c-414c-9601-4455784b5099\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.641783 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/942bddb0-ba7c-414c-9601-4455784b5099-secret-volume\") pod \"942bddb0-ba7c-414c-9601-4455784b5099\" (UID: \"942bddb0-ba7c-414c-9601-4455784b5099\") " Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.642325 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942bddb0-ba7c-414c-9601-4455784b5099-config-volume" (OuterVolumeSpecName: "config-volume") pod "942bddb0-ba7c-414c-9601-4455784b5099" (UID: "942bddb0-ba7c-414c-9601-4455784b5099"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.647977 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942bddb0-ba7c-414c-9601-4455784b5099-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "942bddb0-ba7c-414c-9601-4455784b5099" (UID: "942bddb0-ba7c-414c-9601-4455784b5099"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.648444 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942bddb0-ba7c-414c-9601-4455784b5099-kube-api-access-scxcj" (OuterVolumeSpecName: "kube-api-access-scxcj") pod "942bddb0-ba7c-414c-9601-4455784b5099" (UID: "942bddb0-ba7c-414c-9601-4455784b5099"). InnerVolumeSpecName "kube-api-access-scxcj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.744362 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/942bddb0-ba7c-414c-9601-4455784b5099-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.744397 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-scxcj\" (UniqueName: \"kubernetes.io/projected/942bddb0-ba7c-414c-9601-4455784b5099-kube-api-access-scxcj\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:03 crc kubenswrapper[5200]: I1126 15:45:03.744410 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/942bddb0-ba7c-414c-9601-4455784b5099-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:04 crc kubenswrapper[5200]: I1126 15:45:04.111277 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" event={"ID":"942bddb0-ba7c-414c-9601-4455784b5099","Type":"ContainerDied","Data":"ffeaf896ab13da228c718e6064aad55535620b8c707f3c81812ed0cab1dc536c"} Nov 26 15:45:04 crc kubenswrapper[5200]: I1126 15:45:04.111304 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402865-bd8xj" Nov 26 15:45:04 crc kubenswrapper[5200]: I1126 15:45:04.111323 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffeaf896ab13da228c718e6064aad55535620b8c707f3c81812ed0cab1dc536c" Nov 26 15:45:04 crc kubenswrapper[5200]: I1126 15:45:04.597950 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp"] Nov 26 15:45:04 crc kubenswrapper[5200]: I1126 15:45:04.626917 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-fjwsp"] Nov 26 15:45:04 crc kubenswrapper[5200]: I1126 15:45:04.981559 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d" path="/var/lib/kubelet/pods/a2f32ebe-2b7b-4a3d-b2b4-0b8a1ac3bc0d/volumes" Nov 26 15:45:09 crc kubenswrapper[5200]: I1126 15:45:09.969803 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:45:09 crc kubenswrapper[5200]: E1126 15:45:09.970868 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:45:22 crc kubenswrapper[5200]: I1126 15:45:22.982408 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:45:22 crc kubenswrapper[5200]: E1126 15:45:22.983313 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:45:34 crc kubenswrapper[5200]: I1126 15:45:34.442808 5200 generic.go:358] "Generic (PLEG): container finished" podID="b36710e1-d61f-4fcc-9c44-d1298dedac90" containerID="1657627c9a0ec245b637864098ce3d223488d0665ad04a9190f8f1b138ab4502" exitCode=0 Nov 26 15:45:34 crc kubenswrapper[5200]: I1126 15:45:34.442899 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" event={"ID":"b36710e1-d61f-4fcc-9c44-d1298dedac90","Type":"ContainerDied","Data":"1657627c9a0ec245b637864098ce3d223488d0665ad04a9190f8f1b138ab4502"} Nov 26 15:45:34 crc kubenswrapper[5200]: I1126 15:45:34.969515 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:45:34 crc kubenswrapper[5200]: E1126 15:45:34.970590 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:45:35 crc kubenswrapper[5200]: I1126 15:45:35.124362 5200 scope.go:117] "RemoveContainer" containerID="13b1bf8445b805d72c8c617956efabb4d82b7967303a50729266a9119403b78d" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.014379 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.129113 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ssh-key\") pod \"b36710e1-d61f-4fcc-9c44-d1298dedac90\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.129365 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-secret-0\") pod \"b36710e1-d61f-4fcc-9c44-d1298dedac90\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.129485 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ftgf\" (UniqueName: \"kubernetes.io/projected/b36710e1-d61f-4fcc-9c44-d1298dedac90-kube-api-access-2ftgf\") pod \"b36710e1-d61f-4fcc-9c44-d1298dedac90\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.129577 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ceph\") pod \"b36710e1-d61f-4fcc-9c44-d1298dedac90\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.129646 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-combined-ca-bundle\") pod \"b36710e1-d61f-4fcc-9c44-d1298dedac90\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " Nov 26 15:45:36 crc kubenswrapper[5200]: 
I1126 15:45:36.129844 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-inventory\") pod \"b36710e1-d61f-4fcc-9c44-d1298dedac90\" (UID: \"b36710e1-d61f-4fcc-9c44-d1298dedac90\") " Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.135699 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ceph" (OuterVolumeSpecName: "ceph") pod "b36710e1-d61f-4fcc-9c44-d1298dedac90" (UID: "b36710e1-d61f-4fcc-9c44-d1298dedac90"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.135757 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36710e1-d61f-4fcc-9c44-d1298dedac90-kube-api-access-2ftgf" (OuterVolumeSpecName: "kube-api-access-2ftgf") pod "b36710e1-d61f-4fcc-9c44-d1298dedac90" (UID: "b36710e1-d61f-4fcc-9c44-d1298dedac90"). InnerVolumeSpecName "kube-api-access-2ftgf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.135902 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b36710e1-d61f-4fcc-9c44-d1298dedac90" (UID: "b36710e1-d61f-4fcc-9c44-d1298dedac90"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.161967 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-inventory" (OuterVolumeSpecName: "inventory") pod "b36710e1-d61f-4fcc-9c44-d1298dedac90" (UID: "b36710e1-d61f-4fcc-9c44-d1298dedac90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.163882 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b36710e1-d61f-4fcc-9c44-d1298dedac90" (UID: "b36710e1-d61f-4fcc-9c44-d1298dedac90"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.164028 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b36710e1-d61f-4fcc-9c44-d1298dedac90" (UID: "b36710e1-d61f-4fcc-9c44-d1298dedac90"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.232410 5200 reconciler_common.go:299] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.232447 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ftgf\" (UniqueName: \"kubernetes.io/projected/b36710e1-d61f-4fcc-9c44-d1298dedac90-kube-api-access-2ftgf\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.232460 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.232470 5200 reconciler_common.go:299] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.232479 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.232487 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b36710e1-d61f-4fcc-9c44-d1298dedac90-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.481250 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" event={"ID":"b36710e1-d61f-4fcc-9c44-d1298dedac90","Type":"ContainerDied","Data":"410a82a7082c0df4e99531066e29554913eb2bd16d7868ee179bcfb92b2a92af"} Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.481627 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="410a82a7082c0df4e99531066e29554913eb2bd16d7868ee179bcfb92b2a92af" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.481728 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-m9k4v" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.566423 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-fxsch"] Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.567732 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b36710e1-d61f-4fcc-9c44-d1298dedac90" containerName="libvirt-openstack-openstack-cell1" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.567750 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36710e1-d61f-4fcc-9c44-d1298dedac90" containerName="libvirt-openstack-openstack-cell1" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.567787 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="942bddb0-ba7c-414c-9601-4455784b5099" containerName="collect-profiles" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.567793 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="942bddb0-ba7c-414c-9601-4455784b5099" containerName="collect-profiles" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.568030 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="b36710e1-d61f-4fcc-9c44-d1298dedac90" containerName="libvirt-openstack-openstack-cell1" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.568053 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="942bddb0-ba7c-414c-9601-4455784b5099" containerName="collect-profiles" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.578580 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.585283 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.585537 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"nova-cells-global-config\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.585559 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.585699 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-migration-ssh-key\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.585804 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.586098 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.588169 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-fxsch"] Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.590275 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-compute-config\"" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641104 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-inventory\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: 
\"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641155 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641175 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641255 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ceph\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641309 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641342 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641539 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqd9\" (UniqueName: \"kubernetes.io/projected/047cc627-0c41-454c-a21e-ef04470f6900-kube-api-access-zsqd9\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641646 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641854 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.641936 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.642348 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.744523 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.744575 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-inventory\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.744607 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.744633 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.744668 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ceph\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.745381 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.745424 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.745488 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqd9\" (UniqueName: \"kubernetes.io/projected/047cc627-0c41-454c-a21e-ef04470f6900-kube-api-access-zsqd9\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.745558 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.745709 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.745766 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.746340 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.746612 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.749513 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 
15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.750304 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-inventory\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.751129 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.751158 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.752486 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ceph\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.753147 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.755642 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.756898 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.764315 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqd9\" (UniqueName: \"kubernetes.io/projected/047cc627-0c41-454c-a21e-ef04470f6900-kube-api-access-zsqd9\") pod \"nova-cell1-openstack-openstack-cell1-fxsch\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:36 crc kubenswrapper[5200]: I1126 15:45:36.904734 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:45:37 crc kubenswrapper[5200]: I1126 15:45:37.500753 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-fxsch"] Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.531709 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" event={"ID":"047cc627-0c41-454c-a21e-ef04470f6900","Type":"ContainerStarted","Data":"d571a526adfc3fa5795c9fecc2b0454e84f7b9c6c81c5137a1241b07382ee2c3"} Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.532036 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" event={"ID":"047cc627-0c41-454c-a21e-ef04470f6900","Type":"ContainerStarted","Data":"b530a04fa1460a4622fecbd60ba7817c2f443d77ca4c71888b600f565ba47818"} Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.561890 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" podStartSLOduration=2.123472164 podStartE2EDuration="2.561862577s" podCreationTimestamp="2025-11-26 15:45:36 +0000 UTC" firstStartedPulling="2025-11-26 15:45:37.494450017 +0000 UTC m=+8106.173651855" lastFinishedPulling="2025-11-26 15:45:37.93284044 +0000 UTC m=+8106.612042268" observedRunningTime="2025-11-26 15:45:38.556869853 +0000 UTC m=+8107.236071741" watchObservedRunningTime="2025-11-26 15:45:38.561862577 +0000 UTC m=+8107.241064465" Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.702857 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.708522 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.712133 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:45:38 crc kubenswrapper[5200]: I1126 15:45:38.717220 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:45:42 crc kubenswrapper[5200]: E1126 15:45:42.696472 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443138 actualBytes=10240 Nov 26 15:45:46 crc kubenswrapper[5200]: I1126 15:45:46.970112 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:45:46 crc kubenswrapper[5200]: E1126 15:45:46.971014 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:45:58 crc kubenswrapper[5200]: I1126 15:45:58.969210 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:45:58 crc 
kubenswrapper[5200]: E1126 15:45:58.970202 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:46:11 crc kubenswrapper[5200]: I1126 15:46:11.969650 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:46:12 crc kubenswrapper[5200]: I1126 15:46:12.916543 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"ef9ad0f58e5f8e592093465b409886aff1f705174610696f3abc44e2e24c6402"} Nov 26 15:46:42 crc kubenswrapper[5200]: E1126 15:46:42.767103 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443131 actualBytes=10240 Nov 26 15:47:42 crc kubenswrapper[5200]: E1126 15:47:42.516005 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443131 actualBytes=10240 Nov 26 15:48:33 crc kubenswrapper[5200]: I1126 15:48:33.154992 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:48:33 crc kubenswrapper[5200]: I1126 15:48:33.155840 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:48:43 crc kubenswrapper[5200]: E1126 15:48:43.538529 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441806 actualBytes=10240 Nov 26 15:48:43 crc kubenswrapper[5200]: I1126 15:48:43.606938 5200 generic.go:358] "Generic (PLEG): container finished" podID="047cc627-0c41-454c-a21e-ef04470f6900" containerID="d571a526adfc3fa5795c9fecc2b0454e84f7b9c6c81c5137a1241b07382ee2c3" exitCode=0 Nov 26 15:48:43 crc kubenswrapper[5200]: I1126 15:48:43.607029 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" event={"ID":"047cc627-0c41-454c-a21e-ef04470f6900","Type":"ContainerDied","Data":"d571a526adfc3fa5795c9fecc2b0454e84f7b9c6c81c5137a1241b07382ee2c3"} Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.265732 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.382377 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-combined-ca-bundle\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.382776 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqd9\" (UniqueName: \"kubernetes.io/projected/047cc627-0c41-454c-a21e-ef04470f6900-kube-api-access-zsqd9\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.382859 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-0\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.382907 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-1\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383061 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-inventory\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383111 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-0\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383205 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-1\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383361 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ssh-key\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383553 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ceph\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383643 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-1\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.383745 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-0\") pod \"047cc627-0c41-454c-a21e-ef04470f6900\" (UID: \"047cc627-0c41-454c-a21e-ef04470f6900\") " Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.389055 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.389909 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ceph" (OuterVolumeSpecName: "ceph") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.389991 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047cc627-0c41-454c-a21e-ef04470f6900-kube-api-access-zsqd9" (OuterVolumeSpecName: "kube-api-access-zsqd9") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "kube-api-access-zsqd9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.418525 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-inventory" (OuterVolumeSpecName: "inventory") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.418913 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.418985 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.424067 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.425484 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.426977 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.428487 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.431979 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "047cc627-0c41-454c-a21e-ef04470f6900" (UID: "047cc627-0c41-454c-a21e-ef04470f6900"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487045 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487082 5200 reconciler_common.go:299] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487093 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487102 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487109 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487117 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487125 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487133 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487142 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsqd9\" (UniqueName: \"kubernetes.io/projected/047cc627-0c41-454c-a21e-ef04470f6900-kube-api-access-zsqd9\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487154 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/047cc627-0c41-454c-a21e-ef04470f6900-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.487163 5200 reconciler_common.go:299] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/047cc627-0c41-454c-a21e-ef04470f6900-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.641352 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" event={"ID":"047cc627-0c41-454c-a21e-ef04470f6900","Type":"ContainerDied","Data":"b530a04fa1460a4622fecbd60ba7817c2f443d77ca4c71888b600f565ba47818"} Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.641380 5200 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fxsch" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.641402 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b530a04fa1460a4622fecbd60ba7817c2f443d77ca4c71888b600f565ba47818" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.743628 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-4vqb4"] Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.745513 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="047cc627-0c41-454c-a21e-ef04470f6900" containerName="nova-cell1-openstack-openstack-cell1" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.745540 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="047cc627-0c41-454c-a21e-ef04470f6900" containerName="nova-cell1-openstack-openstack-cell1" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.745801 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="047cc627-0c41-454c-a21e-ef04470f6900" containerName="nova-cell1-openstack-openstack-cell1" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.751533 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.758352 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-4vqb4"] Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.790849 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.791258 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-compute-config-data\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.791535 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.791655 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.791816 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.898916 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ssh-key\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899201 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899402 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceph\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899478 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blzdt\" (UniqueName: \"kubernetes.io/projected/70357171-3962-45c5-9472-042e377c456b-kube-api-access-blzdt\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899551 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899612 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899681 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:45 crc kubenswrapper[5200]: I1126 15:48:45.899755 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-inventory\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.005143 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.006424 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-inventory\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.006940 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ssh-key\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.007438 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.008018 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceph\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.008840 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blzdt\" (UniqueName: \"kubernetes.io/projected/70357171-3962-45c5-9472-042e377c456b-kube-api-access-blzdt\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.010067 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.010606 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.010242 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-inventory\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.010242 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.013193 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" 
(UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.013354 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ssh-key\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.015869 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.017134 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.029457 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceph\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.039753 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blzdt\" (UniqueName: \"kubernetes.io/projected/70357171-3962-45c5-9472-042e377c456b-kube-api-access-blzdt\") pod \"telemetry-openstack-openstack-cell1-4vqb4\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.106144 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:48:46 crc kubenswrapper[5200]: I1126 15:48:46.678320 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-4vqb4"] Nov 26 15:48:47 crc kubenswrapper[5200]: I1126 15:48:47.662560 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" event={"ID":"70357171-3962-45c5-9472-042e377c456b","Type":"ContainerStarted","Data":"94effb33bafea762f26cdae46d79d4439eb5f1be328159e1e6ab9bd16e5e9efd"} Nov 26 15:48:48 crc kubenswrapper[5200]: I1126 15:48:48.673111 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" event={"ID":"70357171-3962-45c5-9472-042e377c456b","Type":"ContainerStarted","Data":"c0303fb14533f08822e7497e3eb2493a2d54b2431b059bc31441ef8827c5f2f8"} Nov 26 15:48:48 crc kubenswrapper[5200]: I1126 15:48:48.694076 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" podStartSLOduration=2.8523550760000003 podStartE2EDuration="3.694061709s" podCreationTimestamp="2025-11-26 15:48:45 +0000 UTC" firstStartedPulling="2025-11-26 15:48:46.680896613 +0000 UTC m=+8295.360098431" lastFinishedPulling="2025-11-26 15:48:47.522603246 +0000 UTC m=+8296.201805064" observedRunningTime="2025-11-26 15:48:48.691649984 +0000 UTC m=+8297.370851802" watchObservedRunningTime="2025-11-26 15:48:48.694061709 +0000 UTC m=+8297.373263527" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.708781 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wt78x"] Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.722016 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wt78x"] Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.722075 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.790025 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-utilities\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.790679 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8trt\" (UniqueName: \"kubernetes.io/projected/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-kube-api-access-w8trt\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.790747 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-catalog-content\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.892767 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-utilities\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.892987 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8trt\" (UniqueName: \"kubernetes.io/projected/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-kube-api-access-w8trt\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.893025 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-catalog-content\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.893361 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-utilities\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.893554 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-catalog-content\") pod \"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:52 crc kubenswrapper[5200]: I1126 15:48:52.920979 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8trt\" (UniqueName: \"kubernetes.io/projected/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-kube-api-access-w8trt\") pod 
\"certified-operators-wt78x\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:53 crc kubenswrapper[5200]: I1126 15:48:53.056072 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:48:53 crc kubenswrapper[5200]: I1126 15:48:53.587989 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wt78x"] Nov 26 15:48:53 crc kubenswrapper[5200]: W1126 15:48:53.593017 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee79a15_7cd4_4d10_aefc_3d3f58dc7c56.slice/crio-e70f3a29a7249a14834f3ac3fc3bc5c0dc27556ec81856e60cefce77a53709c7 WatchSource:0}: Error finding container e70f3a29a7249a14834f3ac3fc3bc5c0dc27556ec81856e60cefce77a53709c7: Status 404 returned error can't find the container with id e70f3a29a7249a14834f3ac3fc3bc5c0dc27556ec81856e60cefce77a53709c7 Nov 26 15:48:53 crc kubenswrapper[5200]: I1126 15:48:53.735740 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerStarted","Data":"e70f3a29a7249a14834f3ac3fc3bc5c0dc27556ec81856e60cefce77a53709c7"} Nov 26 15:48:54 crc kubenswrapper[5200]: I1126 15:48:54.748537 5200 generic.go:358] "Generic (PLEG): container finished" podID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerID="9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba" exitCode=0 Nov 26 15:48:54 crc kubenswrapper[5200]: I1126 15:48:54.748943 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerDied","Data":"9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba"} Nov 26 15:48:55 crc kubenswrapper[5200]: I1126 15:48:55.760636 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerStarted","Data":"4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071"} Nov 26 15:48:56 crc kubenswrapper[5200]: I1126 15:48:56.777388 5200 generic.go:358] "Generic (PLEG): container finished" podID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerID="4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071" exitCode=0 Nov 26 15:48:56 crc kubenswrapper[5200]: I1126 15:48:56.777568 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerDied","Data":"4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071"} Nov 26 15:48:57 crc kubenswrapper[5200]: I1126 15:48:57.788645 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerStarted","Data":"49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e"} Nov 26 15:48:57 crc kubenswrapper[5200]: I1126 15:48:57.814199 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wt78x" podStartSLOduration=5.128179235 podStartE2EDuration="5.814180587s" podCreationTimestamp="2025-11-26 15:48:52 +0000 UTC" firstStartedPulling="2025-11-26 15:48:54.750270487 +0000 UTC m=+8303.429472325" 
lastFinishedPulling="2025-11-26 15:48:55.436271859 +0000 UTC m=+8304.115473677" observedRunningTime="2025-11-26 15:48:57.803085568 +0000 UTC m=+8306.482287386" watchObservedRunningTime="2025-11-26 15:48:57.814180587 +0000 UTC m=+8306.493382405" Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.056577 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.057030 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.109654 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.153895 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.153953 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.889745 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:49:03 crc kubenswrapper[5200]: I1126 15:49:03.935581 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wt78x"] Nov 26 15:49:05 crc kubenswrapper[5200]: I1126 15:49:05.862706 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wt78x" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="registry-server" containerID="cri-o://49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e" gracePeriod=2 Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.343957 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.506786 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-utilities\") pod \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.506846 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8trt\" (UniqueName: \"kubernetes.io/projected/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-kube-api-access-w8trt\") pod \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.507254 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-catalog-content\") pod \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\" (UID: \"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56\") " Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.507913 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-utilities" (OuterVolumeSpecName: "utilities") pod "bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" (UID: "bee79a15-7cd4-4d10-aefc-3d3f58dc7c56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.513013 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-kube-api-access-w8trt" (OuterVolumeSpecName: "kube-api-access-w8trt") pod "bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" (UID: "bee79a15-7cd4-4d10-aefc-3d3f58dc7c56"). InnerVolumeSpecName "kube-api-access-w8trt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.530811 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" (UID: "bee79a15-7cd4-4d10-aefc-3d3f58dc7c56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.610158 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.610195 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.610204 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8trt\" (UniqueName: \"kubernetes.io/projected/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56-kube-api-access-w8trt\") on node \"crc\" DevicePath \"\"" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.874909 5200 generic.go:358] "Generic (PLEG): container finished" podID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerID="49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e" exitCode=0 Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.874961 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerDied","Data":"49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e"} Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.874996 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wt78x" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.875366 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wt78x" event={"ID":"bee79a15-7cd4-4d10-aefc-3d3f58dc7c56","Type":"ContainerDied","Data":"e70f3a29a7249a14834f3ac3fc3bc5c0dc27556ec81856e60cefce77a53709c7"} Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.875403 5200 scope.go:117] "RemoveContainer" containerID="49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.908809 5200 scope.go:117] "RemoveContainer" containerID="4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.921803 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wt78x"] Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.935757 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wt78x"] Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.939598 5200 scope.go:117] "RemoveContainer" containerID="9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.982527 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" path="/var/lib/kubelet/pods/bee79a15-7cd4-4d10-aefc-3d3f58dc7c56/volumes" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.985775 5200 scope.go:117] "RemoveContainer" containerID="49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e" Nov 26 15:49:06 crc kubenswrapper[5200]: E1126 15:49:06.986202 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e\": container with ID 
starting with 49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e not found: ID does not exist" containerID="49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.986310 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e"} err="failed to get container status \"49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e\": rpc error: code = NotFound desc = could not find container \"49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e\": container with ID starting with 49d937152a46a582bbb93252779642dd8563677ad2dc082548aa2bf47bedcf2e not found: ID does not exist" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.986395 5200 scope.go:117] "RemoveContainer" containerID="4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071" Nov 26 15:49:06 crc kubenswrapper[5200]: E1126 15:49:06.986764 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071\": container with ID starting with 4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071 not found: ID does not exist" containerID="4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.986890 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071"} err="failed to get container status \"4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071\": rpc error: code = NotFound desc = could not find container \"4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071\": container with ID starting with 4b67d25962c7f382f2b3d0702e1e688aae0593fa0edb4447626a7ffd4152d071 not found: ID does not exist" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.986968 5200 scope.go:117] "RemoveContainer" containerID="9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba" Nov 26 15:49:06 crc kubenswrapper[5200]: E1126 15:49:06.987253 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba\": container with ID starting with 9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba not found: ID does not exist" containerID="9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba" Nov 26 15:49:06 crc kubenswrapper[5200]: I1126 15:49:06.987336 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba"} err="failed to get container status \"9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba\": rpc error: code = NotFound desc = could not find container \"9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba\": container with ID starting with 9414b474349b4e8b61ca31daa707c4a9e811db81ca4a42d52539dc69b84cc8ba not found: ID does not exist" Nov 26 15:49:33 crc kubenswrapper[5200]: I1126 15:49:33.155215 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:49:33 crc kubenswrapper[5200]: I1126 15:49:33.155887 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:49:33 crc kubenswrapper[5200]: I1126 15:49:33.155962 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:49:33 crc kubenswrapper[5200]: I1126 15:49:33.156989 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef9ad0f58e5f8e592093465b409886aff1f705174610696f3abc44e2e24c6402"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:49:33 crc kubenswrapper[5200]: I1126 15:49:33.157070 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://ef9ad0f58e5f8e592093465b409886aff1f705174610696f3abc44e2e24c6402" gracePeriod=600 Nov 26 15:49:34 crc kubenswrapper[5200]: I1126 15:49:34.216132 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="ef9ad0f58e5f8e592093465b409886aff1f705174610696f3abc44e2e24c6402" exitCode=0 Nov 26 15:49:34 crc kubenswrapper[5200]: I1126 15:49:34.216204 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"ef9ad0f58e5f8e592093465b409886aff1f705174610696f3abc44e2e24c6402"} Nov 26 15:49:34 crc kubenswrapper[5200]: I1126 15:49:34.216813 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3"} Nov 26 15:49:34 crc kubenswrapper[5200]: I1126 15:49:34.216833 5200 scope.go:117] "RemoveContainer" containerID="5cf5b02fcdba1526cf2d9dcb9951bc7beeac373387bb6f30bc628bccee960f02" Nov 26 15:49:42 crc kubenswrapper[5200]: E1126 15:49:42.628868 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441806 actualBytes=10240 Nov 26 15:50:38 crc kubenswrapper[5200]: I1126 15:50:38.875092 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:50:38 crc kubenswrapper[5200]: I1126 15:50:38.881571 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:50:38 crc kubenswrapper[5200]: I1126 15:50:38.883986 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:50:38 crc kubenswrapper[5200]: I1126 15:50:38.888420 5200 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:50:43 crc kubenswrapper[5200]: E1126 15:50:43.045484 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441797 actualBytes=10240 Nov 26 15:51:33 crc kubenswrapper[5200]: I1126 15:51:33.154598 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:51:33 crc kubenswrapper[5200]: I1126 15:51:33.155250 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:51:42 crc kubenswrapper[5200]: E1126 15:51:42.489944 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441799 actualBytes=10240 Nov 26 15:52:03 crc kubenswrapper[5200]: I1126 15:52:03.155308 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:52:03 crc kubenswrapper[5200]: I1126 15:52:03.156063 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.154744 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.155346 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.155393 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.156297 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.156355 5200 kuberuntime_container.go:858] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" gracePeriod=600 Nov 26 15:52:33 crc kubenswrapper[5200]: E1126 15:52:33.317364 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.503585 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" exitCode=0 Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.503657 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3"} Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.503725 5200 scope.go:117] "RemoveContainer" containerID="ef9ad0f58e5f8e592093465b409886aff1f705174610696f3abc44e2e24c6402" Nov 26 15:52:33 crc kubenswrapper[5200]: I1126 15:52:33.504405 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:52:33 crc kubenswrapper[5200]: E1126 15:52:33.504930 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:52:43 crc kubenswrapper[5200]: E1126 15:52:43.026357 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441799 actualBytes=10240 Nov 26 15:52:44 crc kubenswrapper[5200]: I1126 15:52:44.970109 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:52:44 crc kubenswrapper[5200]: E1126 15:52:44.970874 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:52:47 crc kubenswrapper[5200]: I1126 15:52:47.713622 5200 generic.go:358] "Generic (PLEG): container finished" podID="70357171-3962-45c5-9472-042e377c456b" containerID="c0303fb14533f08822e7497e3eb2493a2d54b2431b059bc31441ef8827c5f2f8" exitCode=0 Nov 26 15:52:47 crc kubenswrapper[5200]: I1126 15:52:47.713714 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" 
event={"ID":"70357171-3962-45c5-9472-042e377c456b","Type":"ContainerDied","Data":"c0303fb14533f08822e7497e3eb2493a2d54b2431b059bc31441ef8827c5f2f8"} Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.338445 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475143 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-inventory\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475244 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-1\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475290 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-0\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475457 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceph\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475526 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-2\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475931 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ssh-key\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.475989 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blzdt\" (UniqueName: \"kubernetes.io/projected/70357171-3962-45c5-9472-042e377c456b-kube-api-access-blzdt\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.476811 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-telemetry-combined-ca-bundle\") pod \"70357171-3962-45c5-9472-042e377c456b\" (UID: \"70357171-3962-45c5-9472-042e377c456b\") " Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.484999 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") 
pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.485064 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceph" (OuterVolumeSpecName: "ceph") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.493156 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70357171-3962-45c5-9472-042e377c456b-kube-api-access-blzdt" (OuterVolumeSpecName: "kube-api-access-blzdt") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "kube-api-access-blzdt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.515155 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.523031 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.541531 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-inventory" (OuterVolumeSpecName: "inventory") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.546064 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.554236 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "70357171-3962-45c5-9472-042e377c456b" (UID: "70357171-3962-45c5-9472-042e377c456b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581658 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581709 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-blzdt\" (UniqueName: \"kubernetes.io/projected/70357171-3962-45c5-9472-042e377c456b-kube-api-access-blzdt\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581725 5200 reconciler_common.go:299] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581738 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581751 5200 reconciler_common.go:299] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581763 5200 reconciler_common.go:299] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581774 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.581787 5200 reconciler_common.go:299] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70357171-3962-45c5-9472-042e377c456b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.771901 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.775130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-4vqb4" event={"ID":"70357171-3962-45c5-9472-042e377c456b","Type":"ContainerDied","Data":"94effb33bafea762f26cdae46d79d4439eb5f1be328159e1e6ab9bd16e5e9efd"} Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.775196 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94effb33bafea762f26cdae46d79d4439eb5f1be328159e1e6ab9bd16e5e9efd" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.870012 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dk8vz"] Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872174 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="extract-content" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872202 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="extract-content" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872244 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="registry-server" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872254 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="registry-server" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872267 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="extract-utilities" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872276 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="extract-utilities" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872291 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70357171-3962-45c5-9472-042e377c456b" containerName="telemetry-openstack-openstack-cell1" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872299 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="70357171-3962-45c5-9472-042e377c456b" containerName="telemetry-openstack-openstack-cell1" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872557 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="70357171-3962-45c5-9472-042e377c456b" containerName="telemetry-openstack-openstack-cell1" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.872578 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bee79a15-7cd4-4d10-aefc-3d3f58dc7c56" containerName="registry-server" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.880880 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.883373 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.883641 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-sriov-agent-neutron-config\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.883818 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.883979 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.884153 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.900401 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dk8vz"] Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.992392 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.992448 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.992534 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.992554 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stnmt\" (UniqueName: \"kubernetes.io/projected/f2843638-a976-41b5-b129-9186db7b9257-kube-api-access-stnmt\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.992618 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:49 crc kubenswrapper[5200]: I1126 15:52:49.992711 
5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.094594 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.094675 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.094779 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.094803 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stnmt\" (UniqueName: \"kubernetes.io/projected/f2843638-a976-41b5-b129-9186db7b9257-kube-api-access-stnmt\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.094837 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.094963 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.103374 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.103420 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ssh-key\") 
pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.104781 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.105393 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.107410 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.125841 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stnmt\" (UniqueName: \"kubernetes.io/projected/f2843638-a976-41b5-b129-9186db7b9257-kube-api-access-stnmt\") pod \"neutron-sriov-openstack-openstack-cell1-dk8vz\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.205299 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.812058 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dk8vz"] Nov 26 15:52:50 crc kubenswrapper[5200]: W1126 15:52:50.820456 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2843638_a976_41b5_b129_9186db7b9257.slice/crio-d011c40c426dc8d623f388c7f7f3c5f1c7c63d93e3adf5009e8253699fb43b01 WatchSource:0}: Error finding container d011c40c426dc8d623f388c7f7f3c5f1c7c63d93e3adf5009e8253699fb43b01: Status 404 returned error can't find the container with id d011c40c426dc8d623f388c7f7f3c5f1c7c63d93e3adf5009e8253699fb43b01 Nov 26 15:52:50 crc kubenswrapper[5200]: I1126 15:52:50.822379 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:52:51 crc kubenswrapper[5200]: I1126 15:52:51.806953 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" event={"ID":"f2843638-a976-41b5-b129-9186db7b9257","Type":"ContainerStarted","Data":"d011c40c426dc8d623f388c7f7f3c5f1c7c63d93e3adf5009e8253699fb43b01"} Nov 26 15:52:52 crc kubenswrapper[5200]: I1126 15:52:52.823327 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" event={"ID":"f2843638-a976-41b5-b129-9186db7b9257","Type":"ContainerStarted","Data":"0d364cbab8589a85c8904cc4dff9f198d2d993961db0a76716b22cbccd4e0ee6"} Nov 26 15:52:52 crc kubenswrapper[5200]: I1126 15:52:52.846169 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" podStartSLOduration=3.151139559 podStartE2EDuration="3.846138666s" podCreationTimestamp="2025-11-26 15:52:49 +0000 UTC" firstStartedPulling="2025-11-26 15:52:50.822864425 +0000 UTC m=+8539.502066243" lastFinishedPulling="2025-11-26 15:52:51.517863492 +0000 UTC m=+8540.197065350" observedRunningTime="2025-11-26 15:52:52.842358215 +0000 UTC m=+8541.521560033" watchObservedRunningTime="2025-11-26 15:52:52.846138666 +0000 UTC m=+8541.525340524" Nov 26 15:52:59 crc kubenswrapper[5200]: I1126 15:52:59.969955 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:52:59 crc kubenswrapper[5200]: E1126 15:52:59.970518 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:53:14 crc kubenswrapper[5200]: I1126 15:53:14.969397 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:53:14 crc kubenswrapper[5200]: E1126 15:53:14.970571 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
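The back-off entries for machine-config-daemon-d2nf9 keep repeating from here on. When triaging an excerpt like this, a rough count of CrashLoopBackOff messages per pod is often the quickest signal; a small sketch follows, assuming the excerpt has been saved to a plain-text file named journal.log (the file name and the regular expression are assumptions, not kubelet output).

// backoff_count.go - counts CrashLoopBackOff "Error syncing pod" entries per
// namespace/pod in a saved journal excerpt. Input file name and regexp are
// assumptions; the matched text follows the pod_workers.go lines shown above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("journal.log") // assumed: this excerpt saved to a file
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	// Matches e.g. pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(
	re := regexp.MustCompile(`CrashLoopBackOff: .*pod=([A-Za-z0-9.-]+)_([A-Za-z0-9-]+)\(`)
	counts := map[string]int{}

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[2]+"/"+m[1]]++ // namespace/pod
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	for pod, n := range counts {
		fmt.Printf("%-70s %d\n", pod, n)
	}
}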
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.497745 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xpk9r"] Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.511461 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.512555 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpk9r"] Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.617663 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-catalog-content\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.618002 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxcdf\" (UniqueName: \"kubernetes.io/projected/9943c6ab-d03f-4613-8f42-b1470d82478a-kube-api-access-qxcdf\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.618197 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-utilities\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.722532 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-catalog-content\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.722885 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxcdf\" (UniqueName: \"kubernetes.io/projected/9943c6ab-d03f-4613-8f42-b1470d82478a-kube-api-access-qxcdf\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.723044 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-utilities\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.740374 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-catalog-content\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.740503 5200 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-utilities\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.754454 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxcdf\" (UniqueName: \"kubernetes.io/projected/9943c6ab-d03f-4613-8f42-b1470d82478a-kube-api-access-qxcdf\") pod \"redhat-operators-xpk9r\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:26 crc kubenswrapper[5200]: I1126 15:53:26.866816 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:27 crc kubenswrapper[5200]: I1126 15:53:27.413336 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpk9r"] Nov 26 15:53:27 crc kubenswrapper[5200]: I1126 15:53:27.970048 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:53:27 crc kubenswrapper[5200]: E1126 15:53:27.970595 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:53:28 crc kubenswrapper[5200]: I1126 15:53:28.279870 5200 generic.go:358] "Generic (PLEG): container finished" podID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerID="8adef83aae679ab27ca8bdc8dc67ea0015697a2fddc72b662326c4990ec9afc1" exitCode=0 Nov 26 15:53:28 crc kubenswrapper[5200]: I1126 15:53:28.280258 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerDied","Data":"8adef83aae679ab27ca8bdc8dc67ea0015697a2fddc72b662326c4990ec9afc1"} Nov 26 15:53:28 crc kubenswrapper[5200]: I1126 15:53:28.280307 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerStarted","Data":"252c242c8efc87367067a7e05d32d4e39d6bc21097527f30ed83744969deaca2"} Nov 26 15:53:30 crc kubenswrapper[5200]: I1126 15:53:30.312724 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerStarted","Data":"0c40f4ba7ec0d736377641db3885f02a93b7cb4d7746911ae87e78db9d498e99"} Nov 26 15:53:33 crc kubenswrapper[5200]: I1126 15:53:33.351158 5200 generic.go:358] "Generic (PLEG): container finished" podID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerID="0c40f4ba7ec0d736377641db3885f02a93b7cb4d7746911ae87e78db9d498e99" exitCode=0 Nov 26 15:53:33 crc kubenswrapper[5200]: I1126 15:53:33.351239 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerDied","Data":"0c40f4ba7ec0d736377641db3885f02a93b7cb4d7746911ae87e78db9d498e99"} Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.367880 5200 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerStarted","Data":"7f232df270c3c8b3383f7c8cd905c73bfcb3c850ea36a64fc234aa51b8d081d6"} Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.399908 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xpk9r" podStartSLOduration=7.336840902 podStartE2EDuration="8.399887227s" podCreationTimestamp="2025-11-26 15:53:26 +0000 UTC" firstStartedPulling="2025-11-26 15:53:28.281527428 +0000 UTC m=+8576.960729286" lastFinishedPulling="2025-11-26 15:53:29.344573753 +0000 UTC m=+8578.023775611" observedRunningTime="2025-11-26 15:53:34.392589932 +0000 UTC m=+8583.071791790" watchObservedRunningTime="2025-11-26 15:53:34.399887227 +0000 UTC m=+8583.079089055" Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.858247 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-45jxc"] Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.871981 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.916805 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45jxc"] Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.961446 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-utilities\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.962001 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-catalog-content\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:34 crc kubenswrapper[5200]: I1126 15:53:34.962298 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xqph\" (UniqueName: \"kubernetes.io/projected/f04545af-7b7a-4396-b6d4-1518eac00920-kube-api-access-2xqph\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.053822 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gtlhr"] Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.063270 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.065016 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-catalog-content\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.065174 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xqph\" (UniqueName: \"kubernetes.io/projected/f04545af-7b7a-4396-b6d4-1518eac00920-kube-api-access-2xqph\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.065454 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-utilities\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.065576 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-catalog-content\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.066147 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-utilities\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.071942 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtlhr"] Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.090561 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xqph\" (UniqueName: \"kubernetes.io/projected/f04545af-7b7a-4396-b6d4-1518eac00920-kube-api-access-2xqph\") pod \"community-operators-45jxc\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.167123 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-utilities\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.167466 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-catalog-content\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.167730 5200 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmwq\" (UniqueName: \"kubernetes.io/projected/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-kube-api-access-vkmwq\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.204656 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.269860 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmwq\" (UniqueName: \"kubernetes.io/projected/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-kube-api-access-vkmwq\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.270226 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-utilities\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.270248 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-catalog-content\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.270936 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-utilities\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.270945 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-catalog-content\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.296421 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmwq\" (UniqueName: \"kubernetes.io/projected/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-kube-api-access-vkmwq\") pod \"redhat-marketplace-gtlhr\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.388405 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.811471 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45jxc"] Nov 26 15:53:35 crc kubenswrapper[5200]: W1126 15:53:35.824792 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf04545af_7b7a_4396_b6d4_1518eac00920.slice/crio-a95a2b8970912d001a751d5e753d85236f67a576732394e337682a4a4b79465f WatchSource:0}: Error finding container a95a2b8970912d001a751d5e753d85236f67a576732394e337682a4a4b79465f: Status 404 returned error can't find the container with id a95a2b8970912d001a751d5e753d85236f67a576732394e337682a4a4b79465f Nov 26 15:53:35 crc kubenswrapper[5200]: I1126 15:53:35.937369 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtlhr"] Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.405451 5200 generic.go:358] "Generic (PLEG): container finished" podID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerID="4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1" exitCode=0 Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.405642 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerDied","Data":"4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1"} Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.406175 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerStarted","Data":"ee04bb2c9c66b952a787723e93fcfe2243101a395b4b4ccd0dbcb7cfc6e82d80"} Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.415315 5200 generic.go:358] "Generic (PLEG): container finished" podID="f04545af-7b7a-4396-b6d4-1518eac00920" containerID="912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d" exitCode=0 Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.415399 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerDied","Data":"912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d"} Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.415434 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerStarted","Data":"a95a2b8970912d001a751d5e753d85236f67a576732394e337682a4a4b79465f"} Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.866467 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:36 crc kubenswrapper[5200]: I1126 15:53:36.866900 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:37 crc kubenswrapper[5200]: I1126 15:53:37.433629 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerStarted","Data":"a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc"} Nov 26 15:53:37 crc kubenswrapper[5200]: I1126 15:53:37.947547 5200 prober.go:120] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xpk9r" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="registry-server" probeResult="failure" output=< Nov 26 15:53:37 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 15:53:37 crc kubenswrapper[5200]: > Nov 26 15:53:38 crc kubenswrapper[5200]: I1126 15:53:38.444539 5200 generic.go:358] "Generic (PLEG): container finished" podID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerID="a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc" exitCode=0 Nov 26 15:53:38 crc kubenswrapper[5200]: I1126 15:53:38.444663 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerDied","Data":"a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc"} Nov 26 15:53:38 crc kubenswrapper[5200]: I1126 15:53:38.447771 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerStarted","Data":"fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91"} Nov 26 15:53:39 crc kubenswrapper[5200]: I1126 15:53:39.472348 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerStarted","Data":"1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f"} Nov 26 15:53:39 crc kubenswrapper[5200]: I1126 15:53:39.477932 5200 generic.go:358] "Generic (PLEG): container finished" podID="f04545af-7b7a-4396-b6d4-1518eac00920" containerID="fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91" exitCode=0 Nov 26 15:53:39 crc kubenswrapper[5200]: I1126 15:53:39.478635 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerDied","Data":"fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91"} Nov 26 15:53:39 crc kubenswrapper[5200]: I1126 15:53:39.504352 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gtlhr" podStartSLOduration=3.886080319 podStartE2EDuration="4.504329975s" podCreationTimestamp="2025-11-26 15:53:35 +0000 UTC" firstStartedPulling="2025-11-26 15:53:36.406738429 +0000 UTC m=+8585.085940257" lastFinishedPulling="2025-11-26 15:53:37.024988095 +0000 UTC m=+8585.704189913" observedRunningTime="2025-11-26 15:53:39.498055307 +0000 UTC m=+8588.177257195" watchObservedRunningTime="2025-11-26 15:53:39.504329975 +0000 UTC m=+8588.183531803" Nov 26 15:53:39 crc kubenswrapper[5200]: I1126 15:53:39.970576 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:53:39 crc kubenswrapper[5200]: E1126 15:53:39.971189 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:53:40 crc kubenswrapper[5200]: I1126 15:53:40.503082 5200 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerStarted","Data":"00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79"} Nov 26 15:53:40 crc kubenswrapper[5200]: I1126 15:53:40.539605 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-45jxc" podStartSLOduration=5.58451583 podStartE2EDuration="6.539540196s" podCreationTimestamp="2025-11-26 15:53:34 +0000 UTC" firstStartedPulling="2025-11-26 15:53:36.416570852 +0000 UTC m=+8585.095772670" lastFinishedPulling="2025-11-26 15:53:37.371595218 +0000 UTC m=+8586.050797036" observedRunningTime="2025-11-26 15:53:40.525256714 +0000 UTC m=+8589.204458592" watchObservedRunningTime="2025-11-26 15:53:40.539540196 +0000 UTC m=+8589.218742044" Nov 26 15:53:42 crc kubenswrapper[5200]: E1126 15:53:42.758944 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2528482 actualBytes=10240 Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.205430 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.205834 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.269404 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.390763 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.391026 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.455517 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.635986 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:45 crc kubenswrapper[5200]: I1126 15:53:45.642458 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:46 crc kubenswrapper[5200]: I1126 15:53:46.253412 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtlhr"] Nov 26 15:53:46 crc kubenswrapper[5200]: I1126 15:53:46.915146 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:46 crc kubenswrapper[5200]: I1126 15:53:46.981909 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:47 crc kubenswrapper[5200]: I1126 15:53:47.587671 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gtlhr" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="registry-server" containerID="cri-o://1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f" gracePeriod=2 Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.042823 
5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45jxc"] Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.043305 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-45jxc" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="registry-server" containerID="cri-o://00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79" gracePeriod=2 Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.262284 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.459324 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-utilities\") pod \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.459680 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-catalog-content\") pod \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.460508 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkmwq\" (UniqueName: \"kubernetes.io/projected/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-kube-api-access-vkmwq\") pod \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\" (UID: \"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3\") " Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.460856 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-utilities" (OuterVolumeSpecName: "utilities") pod "d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" (UID: "d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.462044 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.468616 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-kube-api-access-vkmwq" (OuterVolumeSpecName: "kube-api-access-vkmwq") pod "d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" (UID: "d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3"). InnerVolumeSpecName "kube-api-access-vkmwq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.472263 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" (UID: "d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.522814 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.564114 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.564142 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vkmwq\" (UniqueName: \"kubernetes.io/projected/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3-kube-api-access-vkmwq\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.600046 5200 generic.go:358] "Generic (PLEG): container finished" podID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerID="1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f" exitCode=0 Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.600197 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtlhr" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.600198 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerDied","Data":"1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f"} Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.600325 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtlhr" event={"ID":"d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3","Type":"ContainerDied","Data":"ee04bb2c9c66b952a787723e93fcfe2243101a395b4b4ccd0dbcb7cfc6e82d80"} Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.600348 5200 scope.go:117] "RemoveContainer" containerID="1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.612578 5200 generic.go:358] "Generic (PLEG): container finished" podID="f04545af-7b7a-4396-b6d4-1518eac00920" containerID="00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79" exitCode=0 Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.612722 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerDied","Data":"00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79"} Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.612749 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45jxc" event={"ID":"f04545af-7b7a-4396-b6d4-1518eac00920","Type":"ContainerDied","Data":"a95a2b8970912d001a751d5e753d85236f67a576732394e337682a4a4b79465f"} Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.612842 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45jxc" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.641730 5200 scope.go:117] "RemoveContainer" containerID="a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.646244 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtlhr"] Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.654359 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtlhr"] Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.665876 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xqph\" (UniqueName: \"kubernetes.io/projected/f04545af-7b7a-4396-b6d4-1518eac00920-kube-api-access-2xqph\") pod \"f04545af-7b7a-4396-b6d4-1518eac00920\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.665990 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-catalog-content\") pod \"f04545af-7b7a-4396-b6d4-1518eac00920\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.666331 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-utilities\") pod \"f04545af-7b7a-4396-b6d4-1518eac00920\" (UID: \"f04545af-7b7a-4396-b6d4-1518eac00920\") " Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.668125 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-utilities" (OuterVolumeSpecName: "utilities") pod "f04545af-7b7a-4396-b6d4-1518eac00920" (UID: "f04545af-7b7a-4396-b6d4-1518eac00920"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.671622 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04545af-7b7a-4396-b6d4-1518eac00920-kube-api-access-2xqph" (OuterVolumeSpecName: "kube-api-access-2xqph") pod "f04545af-7b7a-4396-b6d4-1518eac00920" (UID: "f04545af-7b7a-4396-b6d4-1518eac00920"). InnerVolumeSpecName "kube-api-access-2xqph". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.673606 5200 scope.go:117] "RemoveContainer" containerID="4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.691866 5200 scope.go:117] "RemoveContainer" containerID="1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f" Nov 26 15:53:48 crc kubenswrapper[5200]: E1126 15:53:48.692803 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f\": container with ID starting with 1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f not found: ID does not exist" containerID="1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.692842 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f"} err="failed to get container status \"1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f\": rpc error: code = NotFound desc = could not find container \"1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f\": container with ID starting with 1cedf464b829127125ec38cc15dde635bceb0b26b59312af9050f48fa196839f not found: ID does not exist" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.692863 5200 scope.go:117] "RemoveContainer" containerID="a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc" Nov 26 15:53:48 crc kubenswrapper[5200]: E1126 15:53:48.693399 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc\": container with ID starting with a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc not found: ID does not exist" containerID="a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.693423 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc"} err="failed to get container status \"a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc\": rpc error: code = NotFound desc = could not find container \"a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc\": container with ID starting with a8c92caa0f56e6d15eb91bc98172f2e02e0a36dc311ff2e8bb4c4e7eb345d7bc not found: ID does not exist" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.693436 5200 scope.go:117] "RemoveContainer" containerID="4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1" Nov 26 15:53:48 crc kubenswrapper[5200]: E1126 15:53:48.693677 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1\": container with ID starting with 4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1 not found: ID does not exist" containerID="4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.693738 5200 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1"} err="failed to get container status \"4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1\": rpc error: code = NotFound desc = could not find container \"4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1\": container with ID starting with 4dc4ed4bd7d4435ee7428030787e238d50f23dd6edfddbedb71cbd4c71256de1 not found: ID does not exist" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.693752 5200 scope.go:117] "RemoveContainer" containerID="00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.715134 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f04545af-7b7a-4396-b6d4-1518eac00920" (UID: "f04545af-7b7a-4396-b6d4-1518eac00920"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.768973 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xqph\" (UniqueName: \"kubernetes.io/projected/f04545af-7b7a-4396-b6d4-1518eac00920-kube-api-access-2xqph\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.769005 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.769015 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04545af-7b7a-4396-b6d4-1518eac00920-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.772090 5200 scope.go:117] "RemoveContainer" containerID="fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.809187 5200 scope.go:117] "RemoveContainer" containerID="912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.851837 5200 scope.go:117] "RemoveContainer" containerID="00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79" Nov 26 15:53:48 crc kubenswrapper[5200]: E1126 15:53:48.852307 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79\": container with ID starting with 00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79 not found: ID does not exist" containerID="00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.852457 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79"} err="failed to get container status \"00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79\": rpc error: code = NotFound desc = could not find container \"00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79\": container with ID starting with 00298729d273738857581db5025e798b4a4a7ff7787a64cebdc96aa86be56d79 not found: ID does not exist" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.852587 
5200 scope.go:117] "RemoveContainer" containerID="fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91" Nov 26 15:53:48 crc kubenswrapper[5200]: E1126 15:53:48.853272 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91\": container with ID starting with fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91 not found: ID does not exist" containerID="fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.853445 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91"} err="failed to get container status \"fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91\": rpc error: code = NotFound desc = could not find container \"fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91\": container with ID starting with fafd60cd08e1902e7b3c0c8a203d106ad9daa71d1d962e25dcdbe1a221445e91 not found: ID does not exist" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.853547 5200 scope.go:117] "RemoveContainer" containerID="912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d" Nov 26 15:53:48 crc kubenswrapper[5200]: E1126 15:53:48.853943 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d\": container with ID starting with 912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d not found: ID does not exist" containerID="912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.853978 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d"} err="failed to get container status \"912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d\": rpc error: code = NotFound desc = could not find container \"912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d\": container with ID starting with 912a3f488c27fc48ff5afe4629102a6bd194fb93f47a92ccf5946737a79da83d not found: ID does not exist" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.987293 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" path="/var/lib/kubelet/pods/d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3/volumes" Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.988375 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45jxc"] Nov 26 15:53:48 crc kubenswrapper[5200]: I1126 15:53:48.988411 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-45jxc"] Nov 26 15:53:50 crc kubenswrapper[5200]: I1126 15:53:50.443376 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xpk9r"] Nov 26 15:53:50 crc kubenswrapper[5200]: I1126 15:53:50.444062 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xpk9r" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="registry-server" containerID="cri-o://7f232df270c3c8b3383f7c8cd905c73bfcb3c850ea36a64fc234aa51b8d081d6" gracePeriod=2 Nov 26 
15:53:50 crc kubenswrapper[5200]: I1126 15:53:50.649134 5200 generic.go:358] "Generic (PLEG): container finished" podID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerID="7f232df270c3c8b3383f7c8cd905c73bfcb3c850ea36a64fc234aa51b8d081d6" exitCode=0 Nov 26 15:53:50 crc kubenswrapper[5200]: I1126 15:53:50.649801 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerDied","Data":"7f232df270c3c8b3383f7c8cd905c73bfcb3c850ea36a64fc234aa51b8d081d6"} Nov 26 15:53:50 crc kubenswrapper[5200]: I1126 15:53:50.989555 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" path="/var/lib/kubelet/pods/f04545af-7b7a-4396-b6d4-1518eac00920/volumes" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.061669 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.131977 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-utilities\") pod \"9943c6ab-d03f-4613-8f42-b1470d82478a\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.132335 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-catalog-content\") pod \"9943c6ab-d03f-4613-8f42-b1470d82478a\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.133831 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxcdf\" (UniqueName: \"kubernetes.io/projected/9943c6ab-d03f-4613-8f42-b1470d82478a-kube-api-access-qxcdf\") pod \"9943c6ab-d03f-4613-8f42-b1470d82478a\" (UID: \"9943c6ab-d03f-4613-8f42-b1470d82478a\") " Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.134000 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-utilities" (OuterVolumeSpecName: "utilities") pod "9943c6ab-d03f-4613-8f42-b1470d82478a" (UID: "9943c6ab-d03f-4613-8f42-b1470d82478a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.135200 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.147126 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9943c6ab-d03f-4613-8f42-b1470d82478a-kube-api-access-qxcdf" (OuterVolumeSpecName: "kube-api-access-qxcdf") pod "9943c6ab-d03f-4613-8f42-b1470d82478a" (UID: "9943c6ab-d03f-4613-8f42-b1470d82478a"). InnerVolumeSpecName "kube-api-access-qxcdf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.219505 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9943c6ab-d03f-4613-8f42-b1470d82478a" (UID: "9943c6ab-d03f-4613-8f42-b1470d82478a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.236921 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9943c6ab-d03f-4613-8f42-b1470d82478a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.237108 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qxcdf\" (UniqueName: \"kubernetes.io/projected/9943c6ab-d03f-4613-8f42-b1470d82478a-kube-api-access-qxcdf\") on node \"crc\" DevicePath \"\"" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.666648 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpk9r" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.666718 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpk9r" event={"ID":"9943c6ab-d03f-4613-8f42-b1470d82478a","Type":"ContainerDied","Data":"252c242c8efc87367067a7e05d32d4e39d6bc21097527f30ed83744969deaca2"} Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.666822 5200 scope.go:117] "RemoveContainer" containerID="7f232df270c3c8b3383f7c8cd905c73bfcb3c850ea36a64fc234aa51b8d081d6" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.702527 5200 scope.go:117] "RemoveContainer" containerID="0c40f4ba7ec0d736377641db3885f02a93b7cb4d7746911ae87e78db9d498e99" Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.714551 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xpk9r"] Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.723324 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xpk9r"] Nov 26 15:53:51 crc kubenswrapper[5200]: I1126 15:53:51.750292 5200 scope.go:117] "RemoveContainer" containerID="8adef83aae679ab27ca8bdc8dc67ea0015697a2fddc72b662326c4990ec9afc1" Nov 26 15:53:52 crc kubenswrapper[5200]: I1126 15:53:52.991928 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" path="/var/lib/kubelet/pods/9943c6ab-d03f-4613-8f42-b1470d82478a/volumes" Nov 26 15:53:53 crc kubenswrapper[5200]: I1126 15:53:53.969448 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:53:53 crc kubenswrapper[5200]: E1126 15:53:53.970034 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:54:08 crc kubenswrapper[5200]: I1126 15:54:08.970245 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 
15:54:08 crc kubenswrapper[5200]: E1126 15:54:08.971421 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:54:20 crc kubenswrapper[5200]: I1126 15:54:20.970346 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:54:20 crc kubenswrapper[5200]: E1126 15:54:20.973869 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:54:32 crc kubenswrapper[5200]: I1126 15:54:32.994235 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:54:32 crc kubenswrapper[5200]: E1126 15:54:32.995765 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:54:42 crc kubenswrapper[5200]: E1126 15:54:42.857646 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443165 actualBytes=10240 Nov 26 15:54:46 crc kubenswrapper[5200]: I1126 15:54:46.978455 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:54:46 crc kubenswrapper[5200]: E1126 15:54:46.979242 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:55:00 crc kubenswrapper[5200]: I1126 15:55:00.970478 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:55:00 crc kubenswrapper[5200]: E1126 15:55:00.973637 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:55:13 crc kubenswrapper[5200]: I1126 15:55:13.968979 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 
15:55:13 crc kubenswrapper[5200]: E1126 15:55:13.969715 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:55:27 crc kubenswrapper[5200]: I1126 15:55:27.969902 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:55:27 crc kubenswrapper[5200]: E1126 15:55:27.970632 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:55:38 crc kubenswrapper[5200]: I1126 15:55:38.969085 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:55:38 crc kubenswrapper[5200]: E1126 15:55:38.970008 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:55:39 crc kubenswrapper[5200]: I1126 15:55:39.105507 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:55:39 crc kubenswrapper[5200]: I1126 15:55:39.110301 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:55:39 crc kubenswrapper[5200]: I1126 15:55:39.110560 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 15:55:39 crc kubenswrapper[5200]: I1126 15:55:39.114327 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 15:55:42 crc kubenswrapper[5200]: E1126 15:55:42.963441 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443164 actualBytes=10240 Nov 26 15:55:51 crc kubenswrapper[5200]: I1126 15:55:51.970313 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:55:51 crc kubenswrapper[5200]: E1126 15:55:51.971846 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:56:06 crc kubenswrapper[5200]: I1126 15:56:06.970207 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:56:06 crc kubenswrapper[5200]: E1126 15:56:06.971350 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:56:21 crc kubenswrapper[5200]: I1126 15:56:21.970341 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:56:21 crc kubenswrapper[5200]: E1126 15:56:21.971928 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:56:33 crc kubenswrapper[5200]: I1126 15:56:33.970879 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:56:33 crc kubenswrapper[5200]: E1126 15:56:33.972088 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:56:43 crc kubenswrapper[5200]: E1126 15:56:43.171222 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441843 actualBytes=10240 Nov 26 15:56:44 crc kubenswrapper[5200]: I1126 15:56:44.971210 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:56:44 crc kubenswrapper[5200]: E1126 15:56:44.971984 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:56:58 crc kubenswrapper[5200]: I1126 15:56:58.971207 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:56:58 crc kubenswrapper[5200]: E1126 15:56:58.972827 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:57:10 crc kubenswrapper[5200]: I1126 15:57:10.970484 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:57:10 crc kubenswrapper[5200]: E1126 15:57:10.971871 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:57:24 crc kubenswrapper[5200]: I1126 15:57:24.970322 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:57:24 crc kubenswrapper[5200]: E1126 15:57:24.972555 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 15:57:38 crc kubenswrapper[5200]: I1126 15:57:38.969471 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 15:57:40 crc kubenswrapper[5200]: I1126 15:57:40.043577 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"b30f1781ca3e63f9c10a22308af73f98c879bc00933c8c661bd0c8a5a54f6c4a"} Nov 26 15:57:43 crc kubenswrapper[5200]: E1126 15:57:43.039627 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441837 actualBytes=10240 Nov 26 15:58:43 crc kubenswrapper[5200]: E1126 15:58:43.259551 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441841 actualBytes=10240 Nov 26 15:58:43 crc kubenswrapper[5200]: I1126 15:58:43.921328 5200 generic.go:358] "Generic (PLEG): container finished" podID="f2843638-a976-41b5-b129-9186db7b9257" containerID="0d364cbab8589a85c8904cc4dff9f198d2d993961db0a76716b22cbccd4e0ee6" exitCode=0 Nov 26 15:58:43 crc kubenswrapper[5200]: I1126 15:58:43.921495 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" event={"ID":"f2843638-a976-41b5-b129-9186db7b9257","Type":"ContainerDied","Data":"0d364cbab8589a85c8904cc4dff9f198d2d993961db0a76716b22cbccd4e0ee6"} Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.549869 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.593290 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ceph\") pod \"f2843638-a976-41b5-b129-9186db7b9257\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.593353 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-combined-ca-bundle\") pod \"f2843638-a976-41b5-b129-9186db7b9257\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.593447 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ssh-key\") pod \"f2843638-a976-41b5-b129-9186db7b9257\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.593483 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stnmt\" (UniqueName: \"kubernetes.io/projected/f2843638-a976-41b5-b129-9186db7b9257-kube-api-access-stnmt\") pod \"f2843638-a976-41b5-b129-9186db7b9257\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.593522 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-inventory\") pod \"f2843638-a976-41b5-b129-9186db7b9257\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.593897 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-agent-neutron-config-0\") pod \"f2843638-a976-41b5-b129-9186db7b9257\" (UID: \"f2843638-a976-41b5-b129-9186db7b9257\") " Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.601792 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "f2843638-a976-41b5-b129-9186db7b9257" (UID: "f2843638-a976-41b5-b129-9186db7b9257"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.615946 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ceph" (OuterVolumeSpecName: "ceph") pod "f2843638-a976-41b5-b129-9186db7b9257" (UID: "f2843638-a976-41b5-b129-9186db7b9257"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.617987 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2843638-a976-41b5-b129-9186db7b9257-kube-api-access-stnmt" (OuterVolumeSpecName: "kube-api-access-stnmt") pod "f2843638-a976-41b5-b129-9186db7b9257" (UID: "f2843638-a976-41b5-b129-9186db7b9257"). InnerVolumeSpecName "kube-api-access-stnmt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.626556 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "f2843638-a976-41b5-b129-9186db7b9257" (UID: "f2843638-a976-41b5-b129-9186db7b9257"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.631640 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2843638-a976-41b5-b129-9186db7b9257" (UID: "f2843638-a976-41b5-b129-9186db7b9257"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.638136 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-inventory" (OuterVolumeSpecName: "inventory") pod "f2843638-a976-41b5-b129-9186db7b9257" (UID: "f2843638-a976-41b5-b129-9186db7b9257"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.697655 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.697695 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.697706 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.697717 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.697726 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stnmt\" (UniqueName: \"kubernetes.io/projected/f2843638-a976-41b5-b129-9186db7b9257-kube-api-access-stnmt\") on node \"crc\" DevicePath \"\"" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.697736 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2843638-a976-41b5-b129-9186db7b9257-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.952725 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" event={"ID":"f2843638-a976-41b5-b129-9186db7b9257","Type":"ContainerDied","Data":"d011c40c426dc8d623f388c7f7f3c5f1c7c63d93e3adf5009e8253699fb43b01"} Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.952800 5200 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d011c40c426dc8d623f388c7f7f3c5f1c7c63d93e3adf5009e8253699fb43b01" Nov 26 15:58:45 crc kubenswrapper[5200]: I1126 15:58:45.952672 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dk8vz" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.083824 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9"] Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085457 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="extract-content" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085482 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="extract-content" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085495 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="extract-content" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085503 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="extract-content" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085522 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="extract-utilities" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085532 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="extract-utilities" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085563 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085571 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085585 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085593 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085612 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="extract-content" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085620 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="extract-content" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085648 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="extract-utilities" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085655 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="extract-utilities" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085677 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" 
containerName="extract-utilities" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085700 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="extract-utilities" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085728 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2843638-a976-41b5-b129-9186db7b9257" containerName="neutron-sriov-openstack-openstack-cell1" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085736 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2843638-a976-41b5-b129-9186db7b9257" containerName="neutron-sriov-openstack-openstack-cell1" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085767 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.085774 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.086024 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f04545af-7b7a-4396-b6d4-1518eac00920" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.086045 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f2bdc4-8cb2-4b9c-8e44-fb11ec64dad3" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.086058 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9943c6ab-d03f-4613-8f42-b1470d82478a" containerName="registry-server" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.086068 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2843638-a976-41b5-b129-9186db7b9257" containerName="neutron-sriov-openstack-openstack-cell1" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.095640 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9"] Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.095771 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.102506 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.102613 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.103129 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-dhcp-agent-neutron-config\"" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.102829 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.102708 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.117778 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.117839 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.117876 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.117939 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.118135 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47lzx\" (UniqueName: \"kubernetes.io/projected/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-kube-api-access-47lzx\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.118209 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-agent-neutron-config-0\") pod 
\"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.220216 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47lzx\" (UniqueName: \"kubernetes.io/projected/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-kube-api-access-47lzx\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.220331 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.220537 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.220604 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.220649 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.220790 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.228107 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.228755 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: 
I1126 15:58:46.228919 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.229369 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.234470 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.262980 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47lzx\" (UniqueName: \"kubernetes.io/projected/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-kube-api-access-47lzx\") pod \"neutron-dhcp-openstack-openstack-cell1-5fjf9\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.420428 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.985534 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9"] Nov 26 15:58:46 crc kubenswrapper[5200]: I1126 15:58:46.988194 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:58:47 crc kubenswrapper[5200]: I1126 15:58:47.983189 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" event={"ID":"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94","Type":"ContainerStarted","Data":"c402bdbc1041b798587d42b5425dd3a54d9c9089d140de97d03f1feaeab4d923"} Nov 26 15:58:49 crc kubenswrapper[5200]: I1126 15:58:49.000012 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" event={"ID":"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94","Type":"ContainerStarted","Data":"a4f4c8e37b7ef2a75f21c29686ad725ed2fcd579830200ae27a03a37dd90279e"} Nov 26 15:58:49 crc kubenswrapper[5200]: I1126 15:58:49.028397 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" podStartSLOduration=2.240920451 podStartE2EDuration="3.028372848s" podCreationTimestamp="2025-11-26 15:58:46 +0000 UTC" firstStartedPulling="2025-11-26 15:58:46.988364263 +0000 UTC m=+8895.667566081" lastFinishedPulling="2025-11-26 15:58:47.77581664 +0000 UTC m=+8896.455018478" observedRunningTime="2025-11-26 15:58:49.01840763 +0000 UTC m=+8897.697609538" watchObservedRunningTime="2025-11-26 15:58:49.028372848 +0000 UTC m=+8897.707574696" Nov 26 15:59:44 crc kubenswrapper[5200]: E1126 15:59:44.712487 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441808 actualBytes=10240 Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.152463 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw"] Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.161722 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.164135 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.165242 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.181887 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw"] Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.269301 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bf9ef89-1f8a-4772-9944-316e115377ad-config-volume\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.269350 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6bf9ef89-1f8a-4772-9944-316e115377ad-secret-volume\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.269522 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmrh\" (UniqueName: \"kubernetes.io/projected/6bf9ef89-1f8a-4772-9944-316e115377ad-kube-api-access-2hmrh\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.371864 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmrh\" (UniqueName: \"kubernetes.io/projected/6bf9ef89-1f8a-4772-9944-316e115377ad-kube-api-access-2hmrh\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.372325 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bf9ef89-1f8a-4772-9944-316e115377ad-config-volume\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.372508 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6bf9ef89-1f8a-4772-9944-316e115377ad-secret-volume\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.373200 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6bf9ef89-1f8a-4772-9944-316e115377ad-config-volume\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.382084 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6bf9ef89-1f8a-4772-9944-316e115377ad-secret-volume\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.403264 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmrh\" (UniqueName: \"kubernetes.io/projected/6bf9ef89-1f8a-4772-9944-316e115377ad-kube-api-access-2hmrh\") pod \"collect-profiles-29402880-msrpw\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:00 crc kubenswrapper[5200]: I1126 16:00:00.500918 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:01 crc kubenswrapper[5200]: I1126 16:00:01.025020 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw"] Nov 26 16:00:02 crc kubenswrapper[5200]: I1126 16:00:02.004672 5200 generic.go:358] "Generic (PLEG): container finished" podID="6bf9ef89-1f8a-4772-9944-316e115377ad" containerID="0e7b09de55d6d03533f6766bfd0b3a2fc75e88bc6a85c9d1899e715459de6454" exitCode=0 Nov 26 16:00:02 crc kubenswrapper[5200]: I1126 16:00:02.005306 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" event={"ID":"6bf9ef89-1f8a-4772-9944-316e115377ad","Type":"ContainerDied","Data":"0e7b09de55d6d03533f6766bfd0b3a2fc75e88bc6a85c9d1899e715459de6454"} Nov 26 16:00:02 crc kubenswrapper[5200]: I1126 16:00:02.005334 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" event={"ID":"6bf9ef89-1f8a-4772-9944-316e115377ad","Type":"ContainerStarted","Data":"5dd96f553dec08455591cba736650e2be13b434bfc552411d2447d086b5f99b3"} Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.154746 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.155081 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.411614 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.554298 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bf9ef89-1f8a-4772-9944-316e115377ad-config-volume\") pod \"6bf9ef89-1f8a-4772-9944-316e115377ad\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.554561 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hmrh\" (UniqueName: \"kubernetes.io/projected/6bf9ef89-1f8a-4772-9944-316e115377ad-kube-api-access-2hmrh\") pod \"6bf9ef89-1f8a-4772-9944-316e115377ad\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.554625 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6bf9ef89-1f8a-4772-9944-316e115377ad-secret-volume\") pod \"6bf9ef89-1f8a-4772-9944-316e115377ad\" (UID: \"6bf9ef89-1f8a-4772-9944-316e115377ad\") " Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.555015 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf9ef89-1f8a-4772-9944-316e115377ad-config-volume" (OuterVolumeSpecName: "config-volume") pod "6bf9ef89-1f8a-4772-9944-316e115377ad" (UID: "6bf9ef89-1f8a-4772-9944-316e115377ad"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.555177 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bf9ef89-1f8a-4772-9944-316e115377ad-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.562119 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf9ef89-1f8a-4772-9944-316e115377ad-kube-api-access-2hmrh" (OuterVolumeSpecName: "kube-api-access-2hmrh") pod "6bf9ef89-1f8a-4772-9944-316e115377ad" (UID: "6bf9ef89-1f8a-4772-9944-316e115377ad"). InnerVolumeSpecName "kube-api-access-2hmrh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.562232 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf9ef89-1f8a-4772-9944-316e115377ad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6bf9ef89-1f8a-4772-9944-316e115377ad" (UID: "6bf9ef89-1f8a-4772-9944-316e115377ad"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.657107 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2hmrh\" (UniqueName: \"kubernetes.io/projected/6bf9ef89-1f8a-4772-9944-316e115377ad-kube-api-access-2hmrh\") on node \"crc\" DevicePath \"\"" Nov 26 16:00:03 crc kubenswrapper[5200]: I1126 16:00:03.657136 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6bf9ef89-1f8a-4772-9944-316e115377ad-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 16:00:04 crc kubenswrapper[5200]: I1126 16:00:04.033734 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" Nov 26 16:00:04 crc kubenswrapper[5200]: I1126 16:00:04.033759 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402880-msrpw" event={"ID":"6bf9ef89-1f8a-4772-9944-316e115377ad","Type":"ContainerDied","Data":"5dd96f553dec08455591cba736650e2be13b434bfc552411d2447d086b5f99b3"} Nov 26 16:00:04 crc kubenswrapper[5200]: I1126 16:00:04.034281 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd96f553dec08455591cba736650e2be13b434bfc552411d2447d086b5f99b3" Nov 26 16:00:04 crc kubenswrapper[5200]: I1126 16:00:04.508956 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k"] Nov 26 16:00:04 crc kubenswrapper[5200]: I1126 16:00:04.520875 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-g954k"] Nov 26 16:00:04 crc kubenswrapper[5200]: I1126 16:00:04.992485 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="042eba1e-d0a8-4058-ac19-3ead5f19889a" path="/var/lib/kubelet/pods/042eba1e-d0a8-4058-ac19-3ead5f19889a/volumes" Nov 26 16:00:33 crc kubenswrapper[5200]: I1126 16:00:33.154295 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:00:33 crc kubenswrapper[5200]: I1126 16:00:33.155086 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:00:35 crc kubenswrapper[5200]: I1126 16:00:35.672224 5200 scope.go:117] "RemoveContainer" containerID="2da214eb36e3fe4a2cca0797ccdb3f8078f1a9af42f0aff0c1c54d64c068e49d" Nov 26 16:00:39 crc kubenswrapper[5200]: I1126 16:00:39.333710 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:00:39 crc kubenswrapper[5200]: I1126 16:00:39.337269 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:00:39 crc kubenswrapper[5200]: I1126 16:00:39.340154 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:00:39 crc kubenswrapper[5200]: I1126 16:00:39.343235 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:00:43 crc kubenswrapper[5200]: E1126 16:00:43.026238 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441799 actualBytes=10240 Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.155628 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29402881-4vvr6"] Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 
16:01:00.157388 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bf9ef89-1f8a-4772-9944-316e115377ad" containerName="collect-profiles" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.157406 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf9ef89-1f8a-4772-9944-316e115377ad" containerName="collect-profiles" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.157670 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bf9ef89-1f8a-4772-9944-316e115377ad" containerName="collect-profiles" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.181295 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29402881-4vvr6"] Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.181428 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.247354 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znhsb\" (UniqueName: \"kubernetes.io/projected/1c323096-3357-46d1-b42c-d5d0f9df9550-kube-api-access-znhsb\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.247855 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-combined-ca-bundle\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.248208 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-config-data\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.248546 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-fernet-keys\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.350936 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znhsb\" (UniqueName: \"kubernetes.io/projected/1c323096-3357-46d1-b42c-d5d0f9df9550-kube-api-access-znhsb\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.351356 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-combined-ca-bundle\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.351728 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-config-data\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.352079 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-fernet-keys\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.365319 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-combined-ca-bundle\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.365344 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-config-data\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.366134 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-fernet-keys\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.386364 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znhsb\" (UniqueName: \"kubernetes.io/projected/1c323096-3357-46d1-b42c-d5d0f9df9550-kube-api-access-znhsb\") pod \"keystone-cron-29402881-4vvr6\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:00 crc kubenswrapper[5200]: I1126 16:01:00.501111 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:01 crc kubenswrapper[5200]: I1126 16:01:01.057065 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29402881-4vvr6"] Nov 26 16:01:01 crc kubenswrapper[5200]: I1126 16:01:01.789022 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402881-4vvr6" event={"ID":"1c323096-3357-46d1-b42c-d5d0f9df9550","Type":"ContainerStarted","Data":"b1e798d517a96d2beb2068e0fd29f4938bf1561d2a096c51ea63db9859a988e0"} Nov 26 16:01:01 crc kubenswrapper[5200]: I1126 16:01:01.789342 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402881-4vvr6" event={"ID":"1c323096-3357-46d1-b42c-d5d0f9df9550","Type":"ContainerStarted","Data":"060ab915afa1a7c65c0333fec0a3d701fe13e30dbf56149d22b298e7f1b5daee"} Nov 26 16:01:01 crc kubenswrapper[5200]: I1126 16:01:01.818332 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29402881-4vvr6" podStartSLOduration=1.818305042 podStartE2EDuration="1.818305042s" podCreationTimestamp="2025-11-26 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:01:01.808584482 +0000 UTC m=+9030.487786370" watchObservedRunningTime="2025-11-26 16:01:01.818305042 +0000 UTC m=+9030.497506890" Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.154843 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.155254 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.155313 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.156425 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b30f1781ca3e63f9c10a22308af73f98c879bc00933c8c661bd0c8a5a54f6c4a"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.156508 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://b30f1781ca3e63f9c10a22308af73f98c879bc00933c8c661bd0c8a5a54f6c4a" gracePeriod=600 Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.826652 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="b30f1781ca3e63f9c10a22308af73f98c879bc00933c8c661bd0c8a5a54f6c4a" exitCode=0 Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.826793 5200 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"b30f1781ca3e63f9c10a22308af73f98c879bc00933c8c661bd0c8a5a54f6c4a"} Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.828456 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73"} Nov 26 16:01:03 crc kubenswrapper[5200]: I1126 16:01:03.828503 5200 scope.go:117] "RemoveContainer" containerID="e246f1076b95b8545cc31aef1b4fdadea29485f2983dcfeb1c544fcce97ff2a3" Nov 26 16:01:04 crc kubenswrapper[5200]: I1126 16:01:04.848489 5200 generic.go:358] "Generic (PLEG): container finished" podID="1c323096-3357-46d1-b42c-d5d0f9df9550" containerID="b1e798d517a96d2beb2068e0fd29f4938bf1561d2a096c51ea63db9859a988e0" exitCode=0 Nov 26 16:01:04 crc kubenswrapper[5200]: I1126 16:01:04.848581 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402881-4vvr6" event={"ID":"1c323096-3357-46d1-b42c-d5d0f9df9550","Type":"ContainerDied","Data":"b1e798d517a96d2beb2068e0fd29f4938bf1561d2a096c51ea63db9859a988e0"} Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.244239 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.306320 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-fernet-keys\") pod \"1c323096-3357-46d1-b42c-d5d0f9df9550\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.306415 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znhsb\" (UniqueName: \"kubernetes.io/projected/1c323096-3357-46d1-b42c-d5d0f9df9550-kube-api-access-znhsb\") pod \"1c323096-3357-46d1-b42c-d5d0f9df9550\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.306490 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-config-data\") pod \"1c323096-3357-46d1-b42c-d5d0f9df9550\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.306754 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-combined-ca-bundle\") pod \"1c323096-3357-46d1-b42c-d5d0f9df9550\" (UID: \"1c323096-3357-46d1-b42c-d5d0f9df9550\") " Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.313244 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1c323096-3357-46d1-b42c-d5d0f9df9550" (UID: "1c323096-3357-46d1-b42c-d5d0f9df9550"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.313333 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c323096-3357-46d1-b42c-d5d0f9df9550-kube-api-access-znhsb" (OuterVolumeSpecName: "kube-api-access-znhsb") pod "1c323096-3357-46d1-b42c-d5d0f9df9550" (UID: "1c323096-3357-46d1-b42c-d5d0f9df9550"). InnerVolumeSpecName "kube-api-access-znhsb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.337978 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c323096-3357-46d1-b42c-d5d0f9df9550" (UID: "1c323096-3357-46d1-b42c-d5d0f9df9550"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.375263 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-config-data" (OuterVolumeSpecName: "config-data") pod "1c323096-3357-46d1-b42c-d5d0f9df9550" (UID: "1c323096-3357-46d1-b42c-d5d0f9df9550"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.409243 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znhsb\" (UniqueName: \"kubernetes.io/projected/1c323096-3357-46d1-b42c-d5d0f9df9550-kube-api-access-znhsb\") on node \"crc\" DevicePath \"\"" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.409273 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.409282 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.409289 5200 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c323096-3357-46d1-b42c-d5d0f9df9550-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.871936 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29402881-4vvr6" Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.871961 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29402881-4vvr6" event={"ID":"1c323096-3357-46d1-b42c-d5d0f9df9550","Type":"ContainerDied","Data":"060ab915afa1a7c65c0333fec0a3d701fe13e30dbf56149d22b298e7f1b5daee"} Nov 26 16:01:06 crc kubenswrapper[5200]: I1126 16:01:06.871998 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="060ab915afa1a7c65c0333fec0a3d701fe13e30dbf56149d22b298e7f1b5daee" Nov 26 16:01:43 crc kubenswrapper[5200]: E1126 16:01:43.121103 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443123 actualBytes=10240 Nov 26 16:02:43 crc kubenswrapper[5200]: E1126 16:02:43.305409 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443122 actualBytes=10240 Nov 26 16:03:03 crc kubenswrapper[5200]: I1126 16:03:03.154811 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:03:03 crc kubenswrapper[5200]: I1126 16:03:03.155417 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:03:33 crc kubenswrapper[5200]: I1126 16:03:33.154565 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:03:33 crc kubenswrapper[5200]: I1126 16:03:33.155215 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:03:42 crc kubenswrapper[5200]: E1126 16:03:42.839839 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443124 actualBytes=10240 Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.735923 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6gng8"] Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.739009 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c323096-3357-46d1-b42c-d5d0f9df9550" containerName="keystone-cron" Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.739048 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c323096-3357-46d1-b42c-d5d0f9df9550" containerName="keystone-cron" Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.739425 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c323096-3357-46d1-b42c-d5d0f9df9550" containerName="keystone-cron" Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.766191 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-6gng8"] Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.766372 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.897626 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-utilities\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.897914 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h999h\" (UniqueName: \"kubernetes.io/projected/d05ff412-12a3-44e4-af51-30771eb833ef-kube-api-access-h999h\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:43 crc kubenswrapper[5200]: I1126 16:03:43.898640 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-catalog-content\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.001084 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-catalog-content\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.001142 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-utilities\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.001214 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h999h\" (UniqueName: \"kubernetes.io/projected/d05ff412-12a3-44e4-af51-30771eb833ef-kube-api-access-h999h\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.001550 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-catalog-content\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.001675 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-utilities\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.034810 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h999h\" (UniqueName: \"kubernetes.io/projected/d05ff412-12a3-44e4-af51-30771eb833ef-kube-api-access-h999h\") pod \"redhat-operators-6gng8\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.094420 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.590862 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6gng8"] Nov 26 16:03:44 crc kubenswrapper[5200]: I1126 16:03:44.896664 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerStarted","Data":"98b76df0f604473483e0dcae80793c7f78a7df47ec533ce45483dfc1e4946fe1"} Nov 26 16:03:45 crc kubenswrapper[5200]: I1126 16:03:45.912710 5200 generic.go:358] "Generic (PLEG): container finished" podID="d05ff412-12a3-44e4-af51-30771eb833ef" containerID="ee4ab94fb06056bcb0884088822d344316aa30b99a14f24d657eba878c650a88" exitCode=0 Nov 26 16:03:45 crc kubenswrapper[5200]: I1126 16:03:45.914268 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerDied","Data":"ee4ab94fb06056bcb0884088822d344316aa30b99a14f24d657eba878c650a88"} Nov 26 16:03:47 crc kubenswrapper[5200]: I1126 16:03:47.946583 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerStarted","Data":"e19f71eae3ae2b484da91be69325490e02b3bc2824a9381f0b9d3368122589d2"} Nov 26 16:03:52 crc kubenswrapper[5200]: I1126 16:03:52.108891 5200 generic.go:358] "Generic (PLEG): container finished" podID="d05ff412-12a3-44e4-af51-30771eb833ef" containerID="e19f71eae3ae2b484da91be69325490e02b3bc2824a9381f0b9d3368122589d2" exitCode=0 Nov 26 16:03:52 crc kubenswrapper[5200]: I1126 16:03:52.108953 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerDied","Data":"e19f71eae3ae2b484da91be69325490e02b3bc2824a9381f0b9d3368122589d2"} Nov 26 16:03:52 crc kubenswrapper[5200]: I1126 16:03:52.112332 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 16:03:53 crc kubenswrapper[5200]: I1126 16:03:53.121967 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerStarted","Data":"40aae5f486fbd2449b04c2444de0d6a2e59c41bf179497c394e47eda9df1ad56"} Nov 26 16:03:53 crc kubenswrapper[5200]: I1126 16:03:53.147117 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6gng8" podStartSLOduration=8.594515419 podStartE2EDuration="10.147098653s" podCreationTimestamp="2025-11-26 16:03:43 +0000 UTC" firstStartedPulling="2025-11-26 16:03:45.915341436 +0000 UTC m=+9194.594543244" lastFinishedPulling="2025-11-26 16:03:47.46792466 +0000 UTC m=+9196.147126478" observedRunningTime="2025-11-26 16:03:53.139568671 +0000 UTC m=+9201.818770499" watchObservedRunningTime="2025-11-26 16:03:53.147098653 +0000 UTC m=+9201.826300471" Nov 26 16:03:54 crc 
kubenswrapper[5200]: I1126 16:03:54.095723 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:54 crc kubenswrapper[5200]: I1126 16:03:54.095784 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:03:55 crc kubenswrapper[5200]: I1126 16:03:55.163600 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6gng8" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="registry-server" probeResult="failure" output=< Nov 26 16:03:55 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 16:03:55 crc kubenswrapper[5200]: > Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.162065 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xs4kq"] Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.177384 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs4kq"] Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.177617 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.335330 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7j4\" (UniqueName: \"kubernetes.io/projected/0e36e637-195c-4354-aa83-d9e4a725ccca-kube-api-access-8r7j4\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.335533 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-utilities\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.335589 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-catalog-content\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.437106 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-utilities\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.437191 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-catalog-content\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.437258 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r7j4\" (UniqueName: 
\"kubernetes.io/projected/0e36e637-195c-4354-aa83-d9e4a725ccca-kube-api-access-8r7j4\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.438189 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-utilities\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.438408 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-catalog-content\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.467468 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7j4\" (UniqueName: \"kubernetes.io/projected/0e36e637-195c-4354-aa83-d9e4a725ccca-kube-api-access-8r7j4\") pod \"community-operators-xs4kq\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:02 crc kubenswrapper[5200]: I1126 16:04:02.507495 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.029024 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs4kq"] Nov 26 16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.154084 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.154183 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.154291 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.155233 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.155314 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" gracePeriod=600 Nov 26 
16:04:03 crc kubenswrapper[5200]: I1126 16:04:03.250491 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerStarted","Data":"53ef56203e1735316a089e150caec25c9230a60f1c46c8d804f964bbc49ed24b"} Nov 26 16:04:03 crc kubenswrapper[5200]: E1126 16:04:03.296020 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.157479 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.218831 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.261132 5200 generic.go:358] "Generic (PLEG): container finished" podID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerID="540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb" exitCode=0 Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.261281 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerDied","Data":"540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb"} Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.264101 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" exitCode=0 Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.264179 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73"} Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.264223 5200 scope.go:117] "RemoveContainer" containerID="b30f1781ca3e63f9c10a22308af73f98c879bc00933c8c661bd0c8a5a54f6c4a" Nov 26 16:04:04 crc kubenswrapper[5200]: I1126 16:04:04.266000 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:04:04 crc kubenswrapper[5200]: E1126 16:04:04.266499 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:04:06 crc kubenswrapper[5200]: I1126 16:04:06.298063 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerStarted","Data":"f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110"} Nov 26 16:04:06 crc 
kubenswrapper[5200]: I1126 16:04:06.535220 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gng8"] Nov 26 16:04:06 crc kubenswrapper[5200]: I1126 16:04:06.535760 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6gng8" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="registry-server" containerID="cri-o://40aae5f486fbd2449b04c2444de0d6a2e59c41bf179497c394e47eda9df1ad56" gracePeriod=2 Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.312585 5200 generic.go:358] "Generic (PLEG): container finished" podID="d05ff412-12a3-44e4-af51-30771eb833ef" containerID="40aae5f486fbd2449b04c2444de0d6a2e59c41bf179497c394e47eda9df1ad56" exitCode=0 Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.312663 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerDied","Data":"40aae5f486fbd2449b04c2444de0d6a2e59c41bf179497c394e47eda9df1ad56"} Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.312976 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gng8" event={"ID":"d05ff412-12a3-44e4-af51-30771eb833ef","Type":"ContainerDied","Data":"98b76df0f604473483e0dcae80793c7f78a7df47ec533ce45483dfc1e4946fe1"} Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.312990 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b76df0f604473483e0dcae80793c7f78a7df47ec533ce45483dfc1e4946fe1" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.317480 5200 generic.go:358] "Generic (PLEG): container finished" podID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerID="f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110" exitCode=0 Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.317594 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerDied","Data":"f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110"} Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.337184 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.410035 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-utilities\") pod \"d05ff412-12a3-44e4-af51-30771eb833ef\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.410090 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h999h\" (UniqueName: \"kubernetes.io/projected/d05ff412-12a3-44e4-af51-30771eb833ef-kube-api-access-h999h\") pod \"d05ff412-12a3-44e4-af51-30771eb833ef\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.410271 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-catalog-content\") pod \"d05ff412-12a3-44e4-af51-30771eb833ef\" (UID: \"d05ff412-12a3-44e4-af51-30771eb833ef\") " Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.411246 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-utilities" (OuterVolumeSpecName: "utilities") pod "d05ff412-12a3-44e4-af51-30771eb833ef" (UID: "d05ff412-12a3-44e4-af51-30771eb833ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.419766 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05ff412-12a3-44e4-af51-30771eb833ef-kube-api-access-h999h" (OuterVolumeSpecName: "kube-api-access-h999h") pod "d05ff412-12a3-44e4-af51-30771eb833ef" (UID: "d05ff412-12a3-44e4-af51-30771eb833ef"). InnerVolumeSpecName "kube-api-access-h999h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.481902 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d05ff412-12a3-44e4-af51-30771eb833ef" (UID: "d05ff412-12a3-44e4-af51-30771eb833ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.512754 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.512786 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d05ff412-12a3-44e4-af51-30771eb833ef-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:07 crc kubenswrapper[5200]: I1126 16:04:07.512796 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h999h\" (UniqueName: \"kubernetes.io/projected/d05ff412-12a3-44e4-af51-30771eb833ef-kube-api-access-h999h\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:08 crc kubenswrapper[5200]: I1126 16:04:08.331799 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerStarted","Data":"13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395"} Nov 26 16:04:08 crc kubenswrapper[5200]: I1126 16:04:08.332590 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gng8" Nov 26 16:04:08 crc kubenswrapper[5200]: I1126 16:04:08.357836 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xs4kq" podStartSLOduration=5.307148598 podStartE2EDuration="6.357811173s" podCreationTimestamp="2025-11-26 16:04:02 +0000 UTC" firstStartedPulling="2025-11-26 16:04:04.262413281 +0000 UTC m=+9212.941615099" lastFinishedPulling="2025-11-26 16:04:05.313075856 +0000 UTC m=+9213.992277674" observedRunningTime="2025-11-26 16:04:08.354905325 +0000 UTC m=+9217.034107163" watchObservedRunningTime="2025-11-26 16:04:08.357811173 +0000 UTC m=+9217.037012991" Nov 26 16:04:08 crc kubenswrapper[5200]: I1126 16:04:08.379406 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gng8"] Nov 26 16:04:08 crc kubenswrapper[5200]: I1126 16:04:08.391315 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6gng8"] Nov 26 16:04:08 crc kubenswrapper[5200]: I1126 16:04:08.981329 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" path="/var/lib/kubelet/pods/d05ff412-12a3-44e4-af51-30771eb833ef/volumes" Nov 26 16:04:12 crc kubenswrapper[5200]: I1126 16:04:12.508053 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:12 crc kubenswrapper[5200]: I1126 16:04:12.508732 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:12 crc kubenswrapper[5200]: I1126 16:04:12.564306 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:13 crc kubenswrapper[5200]: I1126 16:04:13.471369 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:13 crc kubenswrapper[5200]: I1126 16:04:13.739849 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xs4kq"] 
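Editor's note: the "Observed pod startup duration" entry above for community-operators-xs4kq is internally consistent: podStartE2EDuration (6.357811173s) equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (5.307148598s) is that same interval minus the image-pull window between firstStartedPulling and lastFinishedPulling. A minimal Go sketch of the arithmetic, using only the timestamps printed in that entry:

package main

import (
    "fmt"
    "time"
)

func main() {
    // Layout for the kubelet's "2025-11-26 16:04:08.357811173 +0000 UTC" stamps;
    // time.Parse accepts the optional fractional seconds during parsing.
    const layout = "2006-01-02 15:04:05 -0700 MST"
    parse := func(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    // Values copied from the pod_startup_latency_tracker entry above.
    created := parse("2025-11-26 16:04:02 +0000 UTC")             // podCreationTimestamp
    firstPull := parse("2025-11-26 16:04:04.262413281 +0000 UTC") // firstStartedPulling
    lastPull := parse("2025-11-26 16:04:05.313075856 +0000 UTC")  // lastFinishedPulling
    observed := parse("2025-11-26 16:04:08.357811173 +0000 UTC")  // watchObservedRunningTime

    e2e := observed.Sub(created)         // 6.357811173s == podStartE2EDuration
    slo := e2e - lastPull.Sub(firstPull) // 5.307148598s == podStartSLOduration (pull time excluded)
    fmt.Println(e2e, slo)
}

The redhat-marketplace-pmpz7 entry further down decomposes the same way once the pull window is taken from the monotonic m=+... offsets rather than the wall-clock stamps.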
Nov 26 16:04:15 crc kubenswrapper[5200]: I1126 16:04:15.445783 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xs4kq" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="registry-server" containerID="cri-o://13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395" gracePeriod=2 Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.090304 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.121047 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-catalog-content\") pod \"0e36e637-195c-4354-aa83-d9e4a725ccca\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.121208 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r7j4\" (UniqueName: \"kubernetes.io/projected/0e36e637-195c-4354-aa83-d9e4a725ccca-kube-api-access-8r7j4\") pod \"0e36e637-195c-4354-aa83-d9e4a725ccca\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.121403 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-utilities\") pod \"0e36e637-195c-4354-aa83-d9e4a725ccca\" (UID: \"0e36e637-195c-4354-aa83-d9e4a725ccca\") " Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.122989 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-utilities" (OuterVolumeSpecName: "utilities") pod "0e36e637-195c-4354-aa83-d9e4a725ccca" (UID: "0e36e637-195c-4354-aa83-d9e4a725ccca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.132876 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e36e637-195c-4354-aa83-d9e4a725ccca-kube-api-access-8r7j4" (OuterVolumeSpecName: "kube-api-access-8r7j4") pod "0e36e637-195c-4354-aa83-d9e4a725ccca" (UID: "0e36e637-195c-4354-aa83-d9e4a725ccca"). InnerVolumeSpecName "kube-api-access-8r7j4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.209581 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e36e637-195c-4354-aa83-d9e4a725ccca" (UID: "0e36e637-195c-4354-aa83-d9e4a725ccca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.226753 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.226797 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e36e637-195c-4354-aa83-d9e4a725ccca-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.226822 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8r7j4\" (UniqueName: \"kubernetes.io/projected/0e36e637-195c-4354-aa83-d9e4a725ccca-kube-api-access-8r7j4\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.482393 5200 generic.go:358] "Generic (PLEG): container finished" podID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerID="13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395" exitCode=0 Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.483087 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs4kq" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.485733 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerDied","Data":"13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395"} Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.485806 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs4kq" event={"ID":"0e36e637-195c-4354-aa83-d9e4a725ccca","Type":"ContainerDied","Data":"53ef56203e1735316a089e150caec25c9230a60f1c46c8d804f964bbc49ed24b"} Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.485844 5200 scope.go:117] "RemoveContainer" containerID="13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.516151 5200 scope.go:117] "RemoveContainer" containerID="f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.546623 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xs4kq"] Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.557209 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xs4kq"] Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.572027 5200 scope.go:117] "RemoveContainer" containerID="540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.606495 5200 scope.go:117] "RemoveContainer" containerID="13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395" Nov 26 16:04:16 crc kubenswrapper[5200]: E1126 16:04:16.606886 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395\": container with ID starting with 13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395 not found: ID does not exist" containerID="13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395" Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.606918 
5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395"} err="failed to get container status \"13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395\": rpc error: code = NotFound desc = could not find container \"13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395\": container with ID starting with 13143538f7ebed016d400144f55b092bece2c4583c419e93f657c62e773ca395 not found: ID does not exist"
Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.606940 5200 scope.go:117] "RemoveContainer" containerID="f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110"
Nov 26 16:04:16 crc kubenswrapper[5200]: E1126 16:04:16.607321 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110\": container with ID starting with f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110 not found: ID does not exist" containerID="f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110"
Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.607343 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110"} err="failed to get container status \"f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110\": rpc error: code = NotFound desc = could not find container \"f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110\": container with ID starting with f2f5e766627b883f164a4b56905dac39ee5478fc95cfc9aef880873a4ee5a110 not found: ID does not exist"
Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.607356 5200 scope.go:117] "RemoveContainer" containerID="540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb"
Nov 26 16:04:16 crc kubenswrapper[5200]: E1126 16:04:16.607628 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb\": container with ID starting with 540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb not found: ID does not exist" containerID="540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb"
Nov 26 16:04:16 crc kubenswrapper[5200]: I1126 16:04:16.607719 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb"} err="failed to get container status \"540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb\": rpc error: code = NotFound desc = could not find container \"540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb\": container with ID starting with 540613d2a577bf625e0764417bda0ab5472f8a552a4621fa34903ed6ed6a38eb not found: ID does not exist"
Nov 26 16:04:17 crc kubenswrapper[5200]: I1126 16:04:17.002601 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" path="/var/lib/kubelet/pods/0e36e637-195c-4354-aa83-d9e4a725ccca/volumes"
Nov 26 16:04:19 crc kubenswrapper[5200]: I1126 16:04:19.969907 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73"
Nov 26 16:04:19 crc kubenswrapper[5200]: E1126 16:04:19.970440 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908"
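Editor's note: the pod_workers entry above shows the kubelet declining to restart the crash-looping machine-config-daemon container until its restart back-off expires; the delay doubles after each failed start and is capped, and the message indicates it has already hit the 5m0s cap, so each periodic sync only logs "Error syncing pod, skipping" (the same RemoveContainer / pod_workers pair recurs below at 16:04:30, 16:04:42 and 16:04:54). A minimal sketch of that doubling-with-cap schedule, assuming the commonly cited kubelet defaults of a 10s initial delay and a 5m cap rather than values taken from this log:

package main

import (
    "fmt"
    "time"
)

// backoffSchedule returns the successive restart delays produced by a
// doubling back-off with an upper cap. The 10s initial delay and 5m cap used
// in main are assumed kubelet defaults, not values read from this log.
func backoffSchedule(initial, limit time.Duration, restarts int) []time.Duration {
    delays := make([]time.Duration, 0, restarts)
    d := initial
    for i := 0; i < restarts; i++ {
        delays = append(delays, d)
        d *= 2
        if d > limit {
            d = limit
        }
    }
    return delays
}

func main() {
    // Prints [10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s]: after a handful of failed
    // restarts the delay pins at the cap, matching the "back-off 5m0s" message above.
    fmt.Println(backoffSchedule(10*time.Second, 5*time.Minute, 8))
}

The delay is only reset after the container runs cleanly for a sustained period (the documented figure is 10 minutes), which is why the same back-off message keeps repeating throughout this excerpt.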
Nov 26 16:04:30 crc kubenswrapper[5200]: I1126 16:04:30.969562 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73"
Nov 26 16:04:30 crc kubenswrapper[5200]: E1126 16:04:30.970788 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908"
Nov 26 16:04:42 crc kubenswrapper[5200]: E1126 16:04:42.720321 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2441799 actualBytes=10240
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.947613 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pmpz7"]
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950309 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="extract-utilities"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950345 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="extract-utilities"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950394 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="registry-server"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950406 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="registry-server"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950871 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="registry-server"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950895 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="registry-server"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950944 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="extract-utilities"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950956 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="extract-utilities"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950987 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="extract-content"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.950998 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="extract-content"
Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.951092 5200
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="extract-content" Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.951104 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="extract-content" Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.951512 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e36e637-195c-4354-aa83-d9e4a725ccca" containerName="registry-server" Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.951539 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="d05ff412-12a3-44e4-af51-30771eb833ef" containerName="registry-server" Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.964522 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmpz7"] Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.964750 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:42 crc kubenswrapper[5200]: I1126 16:04:42.970084 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:04:42 crc kubenswrapper[5200]: E1126 16:04:42.970972 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.001245 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxqsp\" (UniqueName: \"kubernetes.io/projected/63012475-9830-4864-85a3-2cc6221279ce-kube-api-access-gxqsp\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.001900 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-catalog-content\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.002039 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-utilities\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.104791 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-catalog-content\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.104858 5200 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-utilities\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.104933 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxqsp\" (UniqueName: \"kubernetes.io/projected/63012475-9830-4864-85a3-2cc6221279ce-kube-api-access-gxqsp\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.105247 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-catalog-content\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.105340 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-utilities\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.128832 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxqsp\" (UniqueName: \"kubernetes.io/projected/63012475-9830-4864-85a3-2cc6221279ce-kube-api-access-gxqsp\") pod \"redhat-marketplace-pmpz7\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.291791 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.791107 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmpz7"] Nov 26 16:04:43 crc kubenswrapper[5200]: W1126 16:04:43.800071 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63012475_9830_4864_85a3_2cc6221279ce.slice/crio-c8c172eaabaebd18c9f1947884e63569bfa99a58bead23515798269f7a0fdb83 WatchSource:0}: Error finding container c8c172eaabaebd18c9f1947884e63569bfa99a58bead23515798269f7a0fdb83: Status 404 returned error can't find the container with id c8c172eaabaebd18c9f1947884e63569bfa99a58bead23515798269f7a0fdb83 Nov 26 16:04:43 crc kubenswrapper[5200]: I1126 16:04:43.840392 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmpz7" event={"ID":"63012475-9830-4864-85a3-2cc6221279ce","Type":"ContainerStarted","Data":"c8c172eaabaebd18c9f1947884e63569bfa99a58bead23515798269f7a0fdb83"} Nov 26 16:04:44 crc kubenswrapper[5200]: I1126 16:04:44.857726 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmpz7" event={"ID":"63012475-9830-4864-85a3-2cc6221279ce","Type":"ContainerDied","Data":"35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c"} Nov 26 16:04:44 crc kubenswrapper[5200]: I1126 16:04:44.857778 5200 generic.go:358] "Generic (PLEG): container finished" podID="63012475-9830-4864-85a3-2cc6221279ce" containerID="35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c" exitCode=0 Nov 26 16:04:47 crc kubenswrapper[5200]: I1126 16:04:47.890477 5200 generic.go:358] "Generic (PLEG): container finished" podID="63012475-9830-4864-85a3-2cc6221279ce" containerID="1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0" exitCode=0 Nov 26 16:04:47 crc kubenswrapper[5200]: I1126 16:04:47.890622 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmpz7" event={"ID":"63012475-9830-4864-85a3-2cc6221279ce","Type":"ContainerDied","Data":"1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0"} Nov 26 16:04:48 crc kubenswrapper[5200]: I1126 16:04:48.904672 5200 generic.go:358] "Generic (PLEG): container finished" podID="1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" containerID="a4f4c8e37b7ef2a75f21c29686ad725ed2fcd579830200ae27a03a37dd90279e" exitCode=0 Nov 26 16:04:48 crc kubenswrapper[5200]: I1126 16:04:48.904731 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" event={"ID":"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94","Type":"ContainerDied","Data":"a4f4c8e37b7ef2a75f21c29686ad725ed2fcd579830200ae27a03a37dd90279e"} Nov 26 16:04:48 crc kubenswrapper[5200]: I1126 16:04:48.911243 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmpz7" event={"ID":"63012475-9830-4864-85a3-2cc6221279ce","Type":"ContainerStarted","Data":"e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6"} Nov 26 16:04:48 crc kubenswrapper[5200]: I1126 16:04:48.947607 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pmpz7" podStartSLOduration=5.083623695 podStartE2EDuration="6.947585669s" podCreationTimestamp="2025-11-26 16:04:42 +0000 UTC" firstStartedPulling="2025-11-26 16:04:44.859517623 +0000 UTC 
m=+9253.538719471" lastFinishedPulling="2025-11-26 16:04:46.723479627 +0000 UTC m=+9255.402681445" observedRunningTime="2025-11-26 16:04:48.941395703 +0000 UTC m=+9257.620597541" watchObservedRunningTime="2025-11-26 16:04:48.947585669 +0000 UTC m=+9257.626787497" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.477054 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.604537 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-agent-neutron-config-0\") pod \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.604701 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ssh-key\") pod \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.604790 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47lzx\" (UniqueName: \"kubernetes.io/projected/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-kube-api-access-47lzx\") pod \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.605035 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-combined-ca-bundle\") pod \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.605428 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ceph\") pod \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.605607 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-inventory\") pod \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\" (UID: \"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94\") " Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.610459 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-kube-api-access-47lzx" (OuterVolumeSpecName: "kube-api-access-47lzx") pod "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" (UID: "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94"). InnerVolumeSpecName "kube-api-access-47lzx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.614734 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ceph" (OuterVolumeSpecName: "ceph") pod "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" (UID: "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.614767 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" (UID: "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.639933 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" (UID: "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.640813 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" (UID: "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.662123 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-inventory" (OuterVolumeSpecName: "inventory") pod "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" (UID: "1df5f5ba-1852-442d-9f4a-6d79ffa0ae94"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.709581 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.709639 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.709665 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.709705 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.709724 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47lzx\" (UniqueName: \"kubernetes.io/projected/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-kube-api-access-47lzx\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.709744 5200 reconciler_common.go:299] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5f5ba-1852-442d-9f4a-6d79ffa0ae94-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.947477 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.949449 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-5fjf9" event={"ID":"1df5f5ba-1852-442d-9f4a-6d79ffa0ae94","Type":"ContainerDied","Data":"c402bdbc1041b798587d42b5425dd3a54d9c9089d140de97d03f1feaeab4d923"} Nov 26 16:04:50 crc kubenswrapper[5200]: I1126 16:04:50.949625 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c402bdbc1041b798587d42b5425dd3a54d9c9089d140de97d03f1feaeab4d923" Nov 26 16:04:53 crc kubenswrapper[5200]: I1126 16:04:53.294118 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:53 crc kubenswrapper[5200]: I1126 16:04:53.294772 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:53 crc kubenswrapper[5200]: I1126 16:04:53.355333 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:54 crc kubenswrapper[5200]: I1126 16:04:54.087234 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:54 crc kubenswrapper[5200]: I1126 16:04:54.144321 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmpz7"] Nov 26 16:04:54 crc kubenswrapper[5200]: I1126 16:04:54.969353 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:04:54 crc kubenswrapper[5200]: E1126 16:04:54.969820 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.027758 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pmpz7" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="registry-server" containerID="cri-o://e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6" gracePeriod=2 Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.666537 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.778888 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-catalog-content\") pod \"63012475-9830-4864-85a3-2cc6221279ce\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.779133 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxqsp\" (UniqueName: \"kubernetes.io/projected/63012475-9830-4864-85a3-2cc6221279ce-kube-api-access-gxqsp\") pod \"63012475-9830-4864-85a3-2cc6221279ce\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.779335 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-utilities\") pod \"63012475-9830-4864-85a3-2cc6221279ce\" (UID: \"63012475-9830-4864-85a3-2cc6221279ce\") " Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.782377 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-utilities" (OuterVolumeSpecName: "utilities") pod "63012475-9830-4864-85a3-2cc6221279ce" (UID: "63012475-9830-4864-85a3-2cc6221279ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.788961 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63012475-9830-4864-85a3-2cc6221279ce-kube-api-access-gxqsp" (OuterVolumeSpecName: "kube-api-access-gxqsp") pod "63012475-9830-4864-85a3-2cc6221279ce" (UID: "63012475-9830-4864-85a3-2cc6221279ce"). InnerVolumeSpecName "kube-api-access-gxqsp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.804428 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63012475-9830-4864-85a3-2cc6221279ce" (UID: "63012475-9830-4864-85a3-2cc6221279ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.882601 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.882651 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxqsp\" (UniqueName: \"kubernetes.io/projected/63012475-9830-4864-85a3-2cc6221279ce-kube-api-access-gxqsp\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:56 crc kubenswrapper[5200]: I1126 16:04:56.882671 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63012475-9830-4864-85a3-2cc6221279ce-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.063002 5200 generic.go:358] "Generic (PLEG): container finished" podID="63012475-9830-4864-85a3-2cc6221279ce" containerID="e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6" exitCode=0 Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.063086 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmpz7" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.063100 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmpz7" event={"ID":"63012475-9830-4864-85a3-2cc6221279ce","Type":"ContainerDied","Data":"e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6"} Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.063657 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmpz7" event={"ID":"63012475-9830-4864-85a3-2cc6221279ce","Type":"ContainerDied","Data":"c8c172eaabaebd18c9f1947884e63569bfa99a58bead23515798269f7a0fdb83"} Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.063679 5200 scope.go:117] "RemoveContainer" containerID="e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.092716 5200 scope.go:117] "RemoveContainer" containerID="1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.100521 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmpz7"] Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.111298 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmpz7"] Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.129928 5200 scope.go:117] "RemoveContainer" containerID="35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.163796 5200 scope.go:117] "RemoveContainer" containerID="e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6" Nov 26 16:04:57 crc kubenswrapper[5200]: E1126 16:04:57.164284 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6\": container with ID starting with e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6 not found: ID does not exist" containerID="e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.164318 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6"} err="failed to get container status \"e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6\": rpc error: code = NotFound desc = could not find container \"e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6\": container with ID starting with e067feb456b0bbb9539b2cc526b4875c3ab0e548c6d305de8a2a4374d618f2c6 not found: ID does not exist" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.164337 5200 scope.go:117] "RemoveContainer" containerID="1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0" Nov 26 16:04:57 crc kubenswrapper[5200]: E1126 16:04:57.164716 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0\": container with ID starting with 1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0 not found: ID does not exist" containerID="1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.164776 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0"} err="failed to get container status \"1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0\": rpc error: code = NotFound desc = could not find container \"1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0\": container with ID starting with 1245820b2afc097df6e1ac64f9df652afaa05f1d160fb97bbae28d41c11dd1c0 not found: ID does not exist" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.164814 5200 scope.go:117] "RemoveContainer" containerID="35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c" Nov 26 16:04:57 crc kubenswrapper[5200]: E1126 16:04:57.165154 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c\": container with ID starting with 35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c not found: ID does not exist" containerID="35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c" Nov 26 16:04:57 crc kubenswrapper[5200]: I1126 16:04:57.165178 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c"} err="failed to get container status \"35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c\": rpc error: code = NotFound desc = could not find container \"35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c\": container with ID starting with 35cd8a12fd2a953d2e45127dde863934e6a1827b9ae7a6e954096c3780987b6c not found: ID does not exist" Nov 26 16:04:58 crc kubenswrapper[5200]: I1126 16:04:58.715962 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 16:04:58 crc kubenswrapper[5200]: I1126 16:04:58.716469 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="49ca5aba-a47e-4a54-97d1-bde52324fe21" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" gracePeriod=30 Nov 26 16:04:58 crc 
kubenswrapper[5200]: I1126 16:04:58.985166 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63012475-9830-4864-85a3-2cc6221279ce" path="/var/lib/kubelet/pods/63012475-9830-4864-85a3-2cc6221279ce/volumes" Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.269986 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.270260 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="bd5d470d-750f-4f42-a88a-1518b5cb9e15" containerName="nova-cell1-conductor-conductor" containerID="cri-o://30d1ec1ca96a0dc77e03930adddb940ee98f9568c6d06b8e02de88c48bf70493" gracePeriod=30 Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.458335 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.458752 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-log" containerID="cri-o://77635436a0c57c13afcc0831d7957c372f2fb1e1998f6eabe000a747c9c668dc" gracePeriod=30 Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.458832 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-api" containerID="cri-o://c373a82adbc473c635db51e1820ba54fe6a8f7fc766776e3c21a96eae29830b0" gracePeriod=30 Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.473190 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.473879 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="97d1f719-18bb-48df-b324-d6037111809d" containerName="nova-scheduler-scheduler" containerID="cri-o://5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" gracePeriod=30 Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.540063 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.540418 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-metadata" containerID="cri-o://a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22" gracePeriod=30 Nov 26 16:04:59 crc kubenswrapper[5200]: I1126 16:04:59.540893 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-log" containerID="cri-o://d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c" gracePeriod=30 Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.112983 5200 generic.go:358] "Generic (PLEG): container finished" podID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerID="77635436a0c57c13afcc0831d7957c372f2fb1e1998f6eabe000a747c9c668dc" exitCode=143 Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.114324 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9","Type":"ContainerDied","Data":"77635436a0c57c13afcc0831d7957c372f2fb1e1998f6eabe000a747c9c668dc"} Nov 26 16:05:00 crc kubenswrapper[5200]: 
I1126 16:05:00.129312 5200 generic.go:358] "Generic (PLEG): container finished" podID="bd5d470d-750f-4f42-a88a-1518b5cb9e15" containerID="30d1ec1ca96a0dc77e03930adddb940ee98f9568c6d06b8e02de88c48bf70493" exitCode=0 Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.130250 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd5d470d-750f-4f42-a88a-1518b5cb9e15","Type":"ContainerDied","Data":"30d1ec1ca96a0dc77e03930adddb940ee98f9568c6d06b8e02de88c48bf70493"} Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.133566 5200 generic.go:358] "Generic (PLEG): container finished" podID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerID="d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c" exitCode=143 Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.133980 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ea5bf46-9089-43cc-9a63-552c2dbc5347","Type":"ContainerDied","Data":"d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c"} Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.538227 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.671855 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vszjk\" (UniqueName: \"kubernetes.io/projected/bd5d470d-750f-4f42-a88a-1518b5cb9e15-kube-api-access-vszjk\") pod \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.672053 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-config-data\") pod \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.672222 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-combined-ca-bundle\") pod \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\" (UID: \"bd5d470d-750f-4f42-a88a-1518b5cb9e15\") " Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.677392 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5d470d-750f-4f42-a88a-1518b5cb9e15-kube-api-access-vszjk" (OuterVolumeSpecName: "kube-api-access-vszjk") pod "bd5d470d-750f-4f42-a88a-1518b5cb9e15" (UID: "bd5d470d-750f-4f42-a88a-1518b5cb9e15"). InnerVolumeSpecName "kube-api-access-vszjk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.700443 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5d470d-750f-4f42-a88a-1518b5cb9e15" (UID: "bd5d470d-750f-4f42-a88a-1518b5cb9e15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.709088 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-config-data" (OuterVolumeSpecName: "config-data") pod "bd5d470d-750f-4f42-a88a-1518b5cb9e15" (UID: "bd5d470d-750f-4f42-a88a-1518b5cb9e15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.775162 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.775198 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vszjk\" (UniqueName: \"kubernetes.io/projected/bd5d470d-750f-4f42-a88a-1518b5cb9e15-kube-api-access-vszjk\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:00 crc kubenswrapper[5200]: I1126 16:05:00.775208 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5d470d-750f-4f42-a88a-1518b5cb9e15-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.151498 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd5d470d-750f-4f42-a88a-1518b5cb9e15","Type":"ContainerDied","Data":"a11cf786e379be74a247082939308f3b9c9d99ee3231c4a49a515d8cfa1b722d"} Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.151522 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.151556 5200 scope.go:117] "RemoveContainer" containerID="30d1ec1ca96a0dc77e03930adddb940ee98f9568c6d06b8e02de88c48bf70493" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.188966 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.205938 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.214716 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216061 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="extract-content" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216081 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="extract-content" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216096 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="registry-server" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216102 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="registry-server" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216118 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd5d470d-750f-4f42-a88a-1518b5cb9e15" containerName="nova-cell1-conductor-conductor" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216125 
5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d470d-750f-4f42-a88a-1518b5cb9e15" containerName="nova-cell1-conductor-conductor" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216157 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216162 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216205 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="extract-utilities" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216210 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="extract-utilities" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216397 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd5d470d-750f-4f42-a88a-1518b5cb9e15" containerName="nova-cell1-conductor-conductor" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216407 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="63012475-9830-4864-85a3-2cc6221279ce" containerName="registry-server" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.216418 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="1df5f5ba-1852-442d-9f4a-6d79ffa0ae94" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.221893 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.224923 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.230818 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.388839 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffg77\" (UniqueName: \"kubernetes.io/projected/35b67423-290e-4d22-a345-db88e6c279cd-kube-api-access-ffg77\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.388978 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b67423-290e-4d22-a345-db88e6c279cd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.389050 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b67423-290e-4d22-a345-db88e6c279cd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.492041 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/35b67423-290e-4d22-a345-db88e6c279cd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.492223 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffg77\" (UniqueName: \"kubernetes.io/projected/35b67423-290e-4d22-a345-db88e6c279cd-kube-api-access-ffg77\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.492332 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b67423-290e-4d22-a345-db88e6c279cd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.500026 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b67423-290e-4d22-a345-db88e6c279cd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.504121 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b67423-290e-4d22-a345-db88e6c279cd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.508813 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffg77\" (UniqueName: \"kubernetes.io/projected/35b67423-290e-4d22-a345-db88e6c279cd-kube-api-access-ffg77\") pod \"nova-cell1-conductor-0\" (UID: \"35b67423-290e-4d22-a345-db88e6c279cd\") " pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:01 crc kubenswrapper[5200]: I1126 16:05:01.548953 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:02 crc kubenswrapper[5200]: I1126 16:05:02.082609 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 16:05:02 crc kubenswrapper[5200]: I1126 16:05:02.171963 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"35b67423-290e-4d22-a345-db88e6c279cd","Type":"ContainerStarted","Data":"dc99ce25392155dfafbaefa359158911fed6b2cba2b72d228af50ba089289f6d"} Nov 26 16:05:02 crc kubenswrapper[5200]: I1126 16:05:02.700869 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": read tcp 10.217.0.2:46694->10.217.1.82:8775: read: connection reset by peer" Nov 26 16:05:02 crc kubenswrapper[5200]: I1126 16:05:02.700870 5200 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": read tcp 10.217.0.2:46690->10.217.1.82:8775: read: connection reset by peer" Nov 26 16:05:02 crc kubenswrapper[5200]: I1126 16:05:02.993508 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5d470d-750f-4f42-a88a-1518b5cb9e15" path="/var/lib/kubelet/pods/bd5d470d-750f-4f42-a88a-1518b5cb9e15/volumes" Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.038523 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.044063 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.045662 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.045705 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="49ca5aba-a47e-4a54-97d1-bde52324fe21" containerName="nova-cell0-conductor-conductor" probeResult="unknown" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.065024 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.136650 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-combined-ca-bundle\") pod \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.136910 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-config-data\") pod \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.137049 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl8h8\" (UniqueName: \"kubernetes.io/projected/7ea5bf46-9089-43cc-9a63-552c2dbc5347-kube-api-access-sl8h8\") pod \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.137092 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea5bf46-9089-43cc-9a63-552c2dbc5347-logs\") pod \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\" (UID: \"7ea5bf46-9089-43cc-9a63-552c2dbc5347\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.137918 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea5bf46-9089-43cc-9a63-552c2dbc5347-logs" (OuterVolumeSpecName: "logs") pod "7ea5bf46-9089-43cc-9a63-552c2dbc5347" (UID: "7ea5bf46-9089-43cc-9a63-552c2dbc5347"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.149789 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea5bf46-9089-43cc-9a63-552c2dbc5347-kube-api-access-sl8h8" (OuterVolumeSpecName: "kube-api-access-sl8h8") pod "7ea5bf46-9089-43cc-9a63-552c2dbc5347" (UID: "7ea5bf46-9089-43cc-9a63-552c2dbc5347"). InnerVolumeSpecName "kube-api-access-sl8h8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.157235 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.163940 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.168539 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-config-data" (OuterVolumeSpecName: "config-data") pod "7ea5bf46-9089-43cc-9a63-552c2dbc5347" (UID: "7ea5bf46-9089-43cc-9a63-552c2dbc5347"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.174271 5200 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.174377 5200 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="97d1f719-18bb-48df-b324-d6037111809d" containerName="nova-scheduler-scheduler" probeResult="unknown" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.185242 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"35b67423-290e-4d22-a345-db88e6c279cd","Type":"ContainerStarted","Data":"b9145cf2c40ad51736e63ded291a1f551b6981255403e4aff81f1243ed2e50cf"} Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.185822 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.191122 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ea5bf46-9089-43cc-9a63-552c2dbc5347" (UID: "7ea5bf46-9089-43cc-9a63-552c2dbc5347"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.192888 5200 generic.go:358] "Generic (PLEG): container finished" podID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerID="c373a82adbc473c635db51e1820ba54fe6a8f7fc766776e3c21a96eae29830b0" exitCode=0 Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.192942 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9","Type":"ContainerDied","Data":"c373a82adbc473c635db51e1820ba54fe6a8f7fc766776e3c21a96eae29830b0"} Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.195387 5200 generic.go:358] "Generic (PLEG): container finished" podID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerID="a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22" exitCode=0 Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.195533 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ea5bf46-9089-43cc-9a63-552c2dbc5347","Type":"ContainerDied","Data":"a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22"} Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.195552 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ea5bf46-9089-43cc-9a63-552c2dbc5347","Type":"ContainerDied","Data":"71c9760b2eb7ff89b4d1ecf75e8288684c311e5ceeaccd3799ff49fb19b3fe24"} Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.195568 5200 scope.go:117] "RemoveContainer" containerID="a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.195660 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.225429 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.225409694 podStartE2EDuration="2.225409694s" podCreationTimestamp="2025-11-26 16:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:05:03.203213698 +0000 UTC m=+9271.882415516" watchObservedRunningTime="2025-11-26 16:05:03.225409694 +0000 UTC m=+9271.904611512" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.241293 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.241323 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea5bf46-9089-43cc-9a63-552c2dbc5347-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.241332 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sl8h8\" (UniqueName: \"kubernetes.io/projected/7ea5bf46-9089-43cc-9a63-552c2dbc5347-kube-api-access-sl8h8\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.241341 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea5bf46-9089-43cc-9a63-552c2dbc5347-logs\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.325863 5200 scope.go:117] "RemoveContainer" containerID="d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.349475 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.353266 5200 scope.go:117] "RemoveContainer" containerID="a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22" Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.353909 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22\": container with ID starting with a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22 not found: ID does not exist" containerID="a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.354016 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22"} err="failed to get container status \"a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22\": rpc error: code = NotFound desc = could not find container \"a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22\": container with ID starting with a373416fac68f8d99f4f1838438373eb4d823c28bee6bc50cf5918fcce700e22 not found: ID does not exist" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.354051 5200 scope.go:117] "RemoveContainer" containerID="d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c" Nov 26 16:05:03 crc kubenswrapper[5200]: E1126 16:05:03.354541 5200 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c\": container with ID starting with d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c not found: ID does not exist" containerID="d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.354580 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c"} err="failed to get container status \"d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c\": rpc error: code = NotFound desc = could not find container \"d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c\": container with ID starting with d133efbb8fd28c6f567404d31c04ba644dfd0cb2e71ce77ef1256d4456e69d3c not found: ID does not exist" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.362093 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.373077 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.374415 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-log" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.374512 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-log" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.374773 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-metadata" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.374846 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-metadata" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.375155 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-log" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.375226 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" containerName="nova-metadata-metadata" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.387471 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.387607 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.389381 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.431117 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.453725 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xvw\" (UniqueName: \"kubernetes.io/projected/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-kube-api-access-77xvw\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.453871 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-config-data\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.453966 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.454020 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-logs\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.555235 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-logs\") pod \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.555420 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff7zz\" (UniqueName: \"kubernetes.io/projected/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-kube-api-access-ff7zz\") pod \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.555697 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-config-data\") pod \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.555804 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-logs" (OuterVolumeSpecName: "logs") pod "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" (UID: "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.555937 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-combined-ca-bundle\") pod \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\" (UID: \"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9\") " Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.556250 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77xvw\" (UniqueName: \"kubernetes.io/projected/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-kube-api-access-77xvw\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.556375 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-config-data\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.556478 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.556547 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-logs\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.556737 5200 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-logs\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.557201 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-logs\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.560316 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-config-data\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.560483 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.566019 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-kube-api-access-ff7zz" (OuterVolumeSpecName: "kube-api-access-ff7zz") pod "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" (UID: "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9"). InnerVolumeSpecName "kube-api-access-ff7zz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.571537 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xvw\" (UniqueName: \"kubernetes.io/projected/fde119d8-8d31-4f78-b9dc-f3415b3a6f67-kube-api-access-77xvw\") pod \"nova-metadata-0\" (UID: \"fde119d8-8d31-4f78-b9dc-f3415b3a6f67\") " pod="openstack/nova-metadata-0" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.585877 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" (UID: "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.601907 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-config-data" (OuterVolumeSpecName: "config-data") pod "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" (UID: "5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.659065 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.659107 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ff7zz\" (UniqueName: \"kubernetes.io/projected/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-kube-api-access-ff7zz\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.659123 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:03 crc kubenswrapper[5200]: I1126 16:05:03.752637 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.211474 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.211485 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9","Type":"ContainerDied","Data":"8eededf9eecc6b29afb08347c9ec616704844d74433e4d366a49369299092248"} Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.212752 5200 scope.go:117] "RemoveContainer" containerID="c373a82adbc473c635db51e1820ba54fe6a8f7fc766776e3c21a96eae29830b0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.257336 5200 scope.go:117] "RemoveContainer" containerID="77635436a0c57c13afcc0831d7957c372f2fb1e1998f6eabe000a747c9c668dc" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.268814 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.314658 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.335148 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.347159 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.348941 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-log" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.348962 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-log" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.348972 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-api" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.348978 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-api" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.349231 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-api" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.349257 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" containerName="nova-api-log" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.356231 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.358369 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.361134 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.477464 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba3181-5f0e-459b-a975-541e140dd515-logs\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.477726 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba3181-5f0e-459b-a975-541e140dd515-config-data\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.477904 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgmv\" (UniqueName: \"kubernetes.io/projected/deba3181-5f0e-459b-a975-541e140dd515-kube-api-access-gdgmv\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.478158 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba3181-5f0e-459b-a975-541e140dd515-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.580145 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba3181-5f0e-459b-a975-541e140dd515-logs\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.580397 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba3181-5f0e-459b-a975-541e140dd515-config-data\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.580491 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgmv\" (UniqueName: \"kubernetes.io/projected/deba3181-5f0e-459b-a975-541e140dd515-kube-api-access-gdgmv\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.580556 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba3181-5f0e-459b-a975-541e140dd515-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.583640 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deba3181-5f0e-459b-a975-541e140dd515-logs\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") 
" pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.600071 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba3181-5f0e-459b-a975-541e140dd515-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.600389 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deba3181-5f0e-459b-a975-541e140dd515-config-data\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.607522 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgmv\" (UniqueName: \"kubernetes.io/projected/deba3181-5f0e-459b-a975-541e140dd515-kube-api-access-gdgmv\") pod \"nova-api-0\" (UID: \"deba3181-5f0e-459b-a975-541e140dd515\") " pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.694170 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.723149 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.784347 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-config-data\") pod \"49ca5aba-a47e-4a54-97d1-bde52324fe21\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.784421 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-combined-ca-bundle\") pod \"49ca5aba-a47e-4a54-97d1-bde52324fe21\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.784818 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4cbs\" (UniqueName: \"kubernetes.io/projected/49ca5aba-a47e-4a54-97d1-bde52324fe21-kube-api-access-m4cbs\") pod \"49ca5aba-a47e-4a54-97d1-bde52324fe21\" (UID: \"49ca5aba-a47e-4a54-97d1-bde52324fe21\") " Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.789911 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ca5aba-a47e-4a54-97d1-bde52324fe21-kube-api-access-m4cbs" (OuterVolumeSpecName: "kube-api-access-m4cbs") pod "49ca5aba-a47e-4a54-97d1-bde52324fe21" (UID: "49ca5aba-a47e-4a54-97d1-bde52324fe21"). InnerVolumeSpecName "kube-api-access-m4cbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.840770 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49ca5aba-a47e-4a54-97d1-bde52324fe21" (UID: "49ca5aba-a47e-4a54-97d1-bde52324fe21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.840887 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-config-data" (OuterVolumeSpecName: "config-data") pod "49ca5aba-a47e-4a54-97d1-bde52324fe21" (UID: "49ca5aba-a47e-4a54-97d1-bde52324fe21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.889673 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.889717 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ca5aba-a47e-4a54-97d1-bde52324fe21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.889727 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m4cbs\" (UniqueName: \"kubernetes.io/projected/49ca5aba-a47e-4a54-97d1-bde52324fe21-kube-api-access-m4cbs\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.989529 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9" path="/var/lib/kubelet/pods/5325b0ca-3c7b-46a3-b73a-7d28e79a6fc9/volumes" Nov 26 16:05:04 crc kubenswrapper[5200]: I1126 16:05:04.990177 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea5bf46-9089-43cc-9a63-552c2dbc5347" path="/var/lib/kubelet/pods/7ea5bf46-9089-43cc-9a63-552c2dbc5347/volumes" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.226563 5200 generic.go:358] "Generic (PLEG): container finished" podID="49ca5aba-a47e-4a54-97d1-bde52324fe21" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" exitCode=0 Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.226929 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"49ca5aba-a47e-4a54-97d1-bde52324fe21","Type":"ContainerDied","Data":"f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191"} Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.227071 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"49ca5aba-a47e-4a54-97d1-bde52324fe21","Type":"ContainerDied","Data":"b1d476c4b03c41087fbe1ab4d5d1a9bfa9a8e449b2a41552625c38e0e42bf21d"} Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.227091 5200 scope.go:117] "RemoveContainer" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.226952 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.230436 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fde119d8-8d31-4f78-b9dc-f3415b3a6f67","Type":"ContainerStarted","Data":"2ffc6ea9898744da52bf0d3b7d6e2f6800c2224eeb40544818e7772578207134"} Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.230456 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fde119d8-8d31-4f78-b9dc-f3415b3a6f67","Type":"ContainerStarted","Data":"0ec56ed207c961702869f33fa78ec0d81cf211ea0d56580c704b9a45599f87b7"} Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.230467 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fde119d8-8d31-4f78-b9dc-f3415b3a6f67","Type":"ContainerStarted","Data":"dd09b68e893f48d803c6c56845728168c78e70a1f4320b5d42e36a72695774f1"} Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.260614 5200 scope.go:117] "RemoveContainer" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" Nov 26 16:05:05 crc kubenswrapper[5200]: E1126 16:05:05.261101 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191\": container with ID starting with f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191 not found: ID does not exist" containerID="f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.261154 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191"} err="failed to get container status \"f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191\": rpc error: code = NotFound desc = could not find container \"f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191\": container with ID starting with f3ac13de8c8bb09aa7a5cd12d672759c187e6ddf4bfbed88fae6289cbf6b5191 not found: ID does not exist" Nov 26 16:05:05 crc kubenswrapper[5200]: W1126 16:05:05.268841 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeba3181_5f0e_459b_a975_541e140dd515.slice/crio-3804ec8d9c59e99389b59d7e96d63eef4bf92cca9bafc53aeda968ca8e8984c9 WatchSource:0}: Error finding container 3804ec8d9c59e99389b59d7e96d63eef4bf92cca9bafc53aeda968ca8e8984c9: Status 404 returned error can't find the container with id 3804ec8d9c59e99389b59d7e96d63eef4bf92cca9bafc53aeda968ca8e8984c9 Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.274427 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.284386 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.308044 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.318400 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.320045 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49ca5aba-a47e-4a54-97d1-bde52324fe21" 
containerName="nova-cell0-conductor-conductor" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.320073 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ca5aba-a47e-4a54-97d1-bde52324fe21" containerName="nova-cell0-conductor-conductor" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.320428 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="49ca5aba-a47e-4a54-97d1-bde52324fe21" containerName="nova-cell0-conductor-conductor" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.322532 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.322517309 podStartE2EDuration="2.322517309s" podCreationTimestamp="2025-11-26 16:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:05:05.263094046 +0000 UTC m=+9273.942295894" watchObservedRunningTime="2025-11-26 16:05:05.322517309 +0000 UTC m=+9274.001719127" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.328477 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.333095 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.335521 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.407517 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.407820 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbk7\" (UniqueName: \"kubernetes.io/projected/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-kube-api-access-vjbk7\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.407954 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.513239 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.513380 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbk7\" (UniqueName: \"kubernetes.io/projected/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-kube-api-access-vjbk7\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc 
kubenswrapper[5200]: I1126 16:05:05.513431 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.523376 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.525438 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.555253 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbk7\" (UniqueName: \"kubernetes.io/projected/a3e9d79c-5406-4d25-8ef3-376e4c88c0ea-kube-api-access-vjbk7\") pod \"nova-cell0-conductor-0\" (UID: \"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea\") " pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:05 crc kubenswrapper[5200]: I1126 16:05:05.722959 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.253130 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"deba3181-5f0e-459b-a975-541e140dd515","Type":"ContainerStarted","Data":"4687bec24fad0efe847acb8cd052fea6f226ef8d0c49fb55959777d8e60c3b59"} Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.255373 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"deba3181-5f0e-459b-a975-541e140dd515","Type":"ContainerStarted","Data":"047ac0574dc4af205e5326ff3b030049c2b9406b05ff21d838bd6f0a6e638388"} Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.255404 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"deba3181-5f0e-459b-a975-541e140dd515","Type":"ContainerStarted","Data":"3804ec8d9c59e99389b59d7e96d63eef4bf92cca9bafc53aeda968ca8e8984c9"} Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.263115 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.293565 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.293542288 podStartE2EDuration="2.293542288s" podCreationTimestamp="2025-11-26 16:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:05:06.272406642 +0000 UTC m=+9274.951608470" watchObservedRunningTime="2025-11-26 16:05:06.293542288 +0000 UTC m=+9274.972744116" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.673837 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.741841 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-config-data\") pod \"97d1f719-18bb-48df-b324-d6037111809d\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.742296 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk7q7\" (UniqueName: \"kubernetes.io/projected/97d1f719-18bb-48df-b324-d6037111809d-kube-api-access-xk7q7\") pod \"97d1f719-18bb-48df-b324-d6037111809d\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.742568 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-combined-ca-bundle\") pod \"97d1f719-18bb-48df-b324-d6037111809d\" (UID: \"97d1f719-18bb-48df-b324-d6037111809d\") " Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.750583 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d1f719-18bb-48df-b324-d6037111809d-kube-api-access-xk7q7" (OuterVolumeSpecName: "kube-api-access-xk7q7") pod "97d1f719-18bb-48df-b324-d6037111809d" (UID: "97d1f719-18bb-48df-b324-d6037111809d"). InnerVolumeSpecName "kube-api-access-xk7q7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.785873 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-config-data" (OuterVolumeSpecName: "config-data") pod "97d1f719-18bb-48df-b324-d6037111809d" (UID: "97d1f719-18bb-48df-b324-d6037111809d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.791822 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97d1f719-18bb-48df-b324-d6037111809d" (UID: "97d1f719-18bb-48df-b324-d6037111809d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.849510 5200 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.849603 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xk7q7\" (UniqueName: \"kubernetes.io/projected/97d1f719-18bb-48df-b324-d6037111809d-kube-api-access-xk7q7\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:06 crc kubenswrapper[5200]: I1126 16:05:06.849625 5200 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1f719-18bb-48df-b324-d6037111809d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.018589 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ca5aba-a47e-4a54-97d1-bde52324fe21" path="/var/lib/kubelet/pods/49ca5aba-a47e-4a54-97d1-bde52324fe21/volumes" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.263790 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea","Type":"ContainerStarted","Data":"4568e4eeae5640b326a90181f2de1851d1481a02ae2ffa4765eae8c6eb99d11a"} Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.263833 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3e9d79c-5406-4d25-8ef3-376e4c88c0ea","Type":"ContainerStarted","Data":"bb2c13dfaef80ae1f68cf87c2715b7dcdc3eba74913c837f164e0ff60ee324f9"} Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.265061 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.266401 5200 generic.go:358] "Generic (PLEG): container finished" podID="97d1f719-18bb-48df-b324-d6037111809d" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" exitCode=0 Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.267107 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d1f719-18bb-48df-b324-d6037111809d","Type":"ContainerDied","Data":"5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736"} Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.267117 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.267132 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d1f719-18bb-48df-b324-d6037111809d","Type":"ContainerDied","Data":"6a0823885d93e40061e26dd81b786ae9ec4c5d3de8c100df361d6db9edb4a9eb"} Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.267148 5200 scope.go:117] "RemoveContainer" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.297711 5200 scope.go:117] "RemoveContainer" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" Nov 26 16:05:07 crc kubenswrapper[5200]: E1126 16:05:07.300590 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736\": container with ID starting with 5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736 not found: ID does not exist" containerID="5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.300738 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736"} err="failed to get container status \"5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736\": rpc error: code = NotFound desc = could not find container \"5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736\": container with ID starting with 5162e3c20f35ed24ddfcc43d1f421c6c9049847c1bcc6af0390a32b94e665736 not found: ID does not exist" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.305993 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.305968558 podStartE2EDuration="2.305968558s" podCreationTimestamp="2025-11-26 16:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:05:07.2851454 +0000 UTC m=+9275.964347238" watchObservedRunningTime="2025-11-26 16:05:07.305968558 +0000 UTC m=+9275.985170426" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.335341 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.353055 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.369429 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.370688 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97d1f719-18bb-48df-b324-d6037111809d" containerName="nova-scheduler-scheduler" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.370702 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d1f719-18bb-48df-b324-d6037111809d" containerName="nova-scheduler-scheduler" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.371026 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="97d1f719-18bb-48df-b324-d6037111809d" containerName="nova-scheduler-scheduler" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.382038 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.382226 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.386534 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.484754 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba34874-24d5-4073-8965-f81ac1846ea1-config-data\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.484851 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vf7f\" (UniqueName: \"kubernetes.io/projected/aba34874-24d5-4073-8965-f81ac1846ea1-kube-api-access-4vf7f\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.485210 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba34874-24d5-4073-8965-f81ac1846ea1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.588587 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba34874-24d5-4073-8965-f81ac1846ea1-config-data\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.588655 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vf7f\" (UniqueName: \"kubernetes.io/projected/aba34874-24d5-4073-8965-f81ac1846ea1-kube-api-access-4vf7f\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.588792 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba34874-24d5-4073-8965-f81ac1846ea1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.596012 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba34874-24d5-4073-8965-f81ac1846ea1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.603303 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba34874-24d5-4073-8965-f81ac1846ea1-config-data\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.612220 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vf7f\" (UniqueName: 
\"kubernetes.io/projected/aba34874-24d5-4073-8965-f81ac1846ea1-kube-api-access-4vf7f\") pod \"nova-scheduler-0\" (UID: \"aba34874-24d5-4073-8965-f81ac1846ea1\") " pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.707329 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 16:05:07 crc kubenswrapper[5200]: I1126 16:05:07.990265 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 16:05:08 crc kubenswrapper[5200]: I1126 16:05:08.277904 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aba34874-24d5-4073-8965-f81ac1846ea1","Type":"ContainerStarted","Data":"44483da2f732b7ee0e912755a2224767f9bb4dfb576f1417a819959269ad93a2"} Nov 26 16:05:08 crc kubenswrapper[5200]: I1126 16:05:08.752897 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 16:05:08 crc kubenswrapper[5200]: I1126 16:05:08.752952 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Nov 26 16:05:08 crc kubenswrapper[5200]: I1126 16:05:08.985069 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d1f719-18bb-48df-b324-d6037111809d" path="/var/lib/kubelet/pods/97d1f719-18bb-48df-b324-d6037111809d/volumes" Nov 26 16:05:09 crc kubenswrapper[5200]: I1126 16:05:09.262501 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 26 16:05:09 crc kubenswrapper[5200]: I1126 16:05:09.318870 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aba34874-24d5-4073-8965-f81ac1846ea1","Type":"ContainerStarted","Data":"34aad5406cba33f3d5cff70972a32251516f738dfcc36bc3a3c3bc3e156f2361"} Nov 26 16:05:09 crc kubenswrapper[5200]: I1126 16:05:09.346776 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.3467553629999998 podStartE2EDuration="2.346755363s" podCreationTimestamp="2025-11-26 16:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:05:09.332148442 +0000 UTC m=+9278.011350270" watchObservedRunningTime="2025-11-26 16:05:09.346755363 +0000 UTC m=+9278.025957181" Nov 26 16:05:09 crc kubenswrapper[5200]: I1126 16:05:09.973241 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:05:09 crc kubenswrapper[5200]: E1126 16:05:09.973720 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:05:12 crc kubenswrapper[5200]: I1126 16:05:12.707743 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Nov 26 16:05:13 crc kubenswrapper[5200]: I1126 16:05:13.753432 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 16:05:13 crc kubenswrapper[5200]: I1126 16:05:13.753946 5200 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 16:05:14 crc kubenswrapper[5200]: I1126 16:05:14.355448 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 26 16:05:14 crc kubenswrapper[5200]: I1126 16:05:14.724490 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 16:05:14 crc kubenswrapper[5200]: I1126 16:05:14.724525 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 16:05:14 crc kubenswrapper[5200]: I1126 16:05:14.794873 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fde119d8-8d31-4f78-b9dc-f3415b3a6f67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.194:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 16:05:14 crc kubenswrapper[5200]: I1126 16:05:14.794940 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fde119d8-8d31-4f78-b9dc-f3415b3a6f67" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.194:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 16:05:15 crc kubenswrapper[5200]: I1126 16:05:15.765919 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="deba3181-5f0e-459b-a975-541e140dd515" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 16:05:15 crc kubenswrapper[5200]: I1126 16:05:15.766801 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="deba3181-5f0e-459b-a975-541e140dd515" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 16:05:17 crc kubenswrapper[5200]: I1126 16:05:17.707732 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 16:05:17 crc kubenswrapper[5200]: I1126 16:05:17.747679 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 16:05:18 crc kubenswrapper[5200]: I1126 16:05:18.461523 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 16:05:21 crc kubenswrapper[5200]: I1126 16:05:21.970935 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:05:21 crc kubenswrapper[5200]: E1126 16:05:21.972007 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:05:23 crc kubenswrapper[5200]: I1126 16:05:23.756060 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 16:05:23 crc kubenswrapper[5200]: I1126 16:05:23.756744 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Nov 26 16:05:23 crc kubenswrapper[5200]: I1126 16:05:23.759353 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 16:05:23 crc kubenswrapper[5200]: I1126 16:05:23.760817 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 16:05:24 crc kubenswrapper[5200]: I1126 16:05:24.729021 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 16:05:24 crc kubenswrapper[5200]: I1126 16:05:24.729554 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 16:05:24 crc kubenswrapper[5200]: I1126 16:05:24.731938 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 16:05:24 crc kubenswrapper[5200]: I1126 16:05:24.734020 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 16:05:25 crc kubenswrapper[5200]: I1126 16:05:25.508750 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Nov 26 16:05:25 crc kubenswrapper[5200]: I1126 16:05:25.513633 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.488491 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n"] Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.501266 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.504666 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-migration-ssh-key\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.505262 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.505357 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-dockercfg-qg24f\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.505470 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-cell1\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.505601 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"nova-cells-global-config\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.505662 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-adoption-secret\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.505679 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-compute-config\"" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.518584 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n"] Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.613905 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614193 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zvq\" (UniqueName: \"kubernetes.io/projected/4308e63d-232d-4d73-a202-5bb6000bac9c-kube-api-access-r7zvq\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614297 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614433 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614539 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614845 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614922 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.614989 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-combined-ca-bundle\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.615094 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.615147 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.615255 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717643 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717718 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717770 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717804 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" 
Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717850 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717925 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717948 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7zvq\" (UniqueName: \"kubernetes.io/projected/4308e63d-232d-4d73-a202-5bb6000bac9c-kube-api-access-r7zvq\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.717966 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.718005 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.718032 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.718116 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.719070 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.719277 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.725321 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.729616 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.730423 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.730451 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.731946 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.732228 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.732304 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.733072 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.734622 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zvq\" (UniqueName: \"kubernetes.io/projected/4308e63d-232d-4d73-a202-5bb6000bac9c-kube-api-access-r7zvq\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:26 crc kubenswrapper[5200]: I1126 16:05:26.824250 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:05:27 crc kubenswrapper[5200]: I1126 16:05:27.431427 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n"] Nov 26 16:05:27 crc kubenswrapper[5200]: W1126 16:05:27.440800 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4308e63d_232d_4d73_a202_5bb6000bac9c.slice/crio-c8bd30e500ec172fe484e04618ecd32f8ce9324dbfcb2c9b1cdd1404cd8d79be WatchSource:0}: Error finding container c8bd30e500ec172fe484e04618ecd32f8ce9324dbfcb2c9b1cdd1404cd8d79be: Status 404 returned error can't find the container with id c8bd30e500ec172fe484e04618ecd32f8ce9324dbfcb2c9b1cdd1404cd8d79be Nov 26 16:05:27 crc kubenswrapper[5200]: I1126 16:05:27.528963 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" event={"ID":"4308e63d-232d-4d73-a202-5bb6000bac9c","Type":"ContainerStarted","Data":"c8bd30e500ec172fe484e04618ecd32f8ce9324dbfcb2c9b1cdd1404cd8d79be"} Nov 26 16:05:29 crc kubenswrapper[5200]: I1126 16:05:29.555350 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" event={"ID":"4308e63d-232d-4d73-a202-5bb6000bac9c","Type":"ContainerStarted","Data":"3d3f56c64c5ea5bb906e7048add3766766d6fb40bc67dee906e10aecba52326e"} Nov 26 16:05:29 crc kubenswrapper[5200]: I1126 16:05:29.586463 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" podStartSLOduration=2.225446523 podStartE2EDuration="3.58644145s" podCreationTimestamp="2025-11-26 16:05:26 +0000 UTC" firstStartedPulling="2025-11-26 16:05:27.444253196 +0000 UTC m=+9296.123455014" lastFinishedPulling="2025-11-26 16:05:28.805248093 +0000 UTC m=+9297.484449941" observedRunningTime="2025-11-26 16:05:29.579012751 +0000 UTC m=+9298.258214609" watchObservedRunningTime="2025-11-26 16:05:29.58644145 +0000 UTC 
m=+9298.265643268" Nov 26 16:05:32 crc kubenswrapper[5200]: I1126 16:05:32.984776 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:05:32 crc kubenswrapper[5200]: E1126 16:05:32.985852 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:05:39 crc kubenswrapper[5200]: I1126 16:05:39.487584 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:05:39 crc kubenswrapper[5200]: I1126 16:05:39.492460 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:05:39 crc kubenswrapper[5200]: I1126 16:05:39.494992 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:05:39 crc kubenswrapper[5200]: I1126 16:05:39.496490 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:05:42 crc kubenswrapper[5200]: E1126 16:05:42.967638 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442028 actualBytes=10240 Nov 26 16:05:44 crc kubenswrapper[5200]: I1126 16:05:44.970537 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:05:44 crc kubenswrapper[5200]: E1126 16:05:44.972118 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:05:55 crc kubenswrapper[5200]: I1126 16:05:55.969822 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:05:55 crc kubenswrapper[5200]: E1126 16:05:55.970825 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:05:59 crc kubenswrapper[5200]: I1126 16:05:59.920335 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rn4c8"] Nov 26 16:05:59 crc kubenswrapper[5200]: I1126 16:05:59.939707 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rn4c8"] Nov 26 16:05:59 crc kubenswrapper[5200]: I1126 
16:05:59.939850 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:05:59 crc kubenswrapper[5200]: I1126 16:05:59.983462 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-utilities\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:05:59 crc kubenswrapper[5200]: I1126 16:05:59.984137 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86bf\" (UniqueName: \"kubernetes.io/projected/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-kube-api-access-l86bf\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:05:59 crc kubenswrapper[5200]: I1126 16:05:59.984484 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-catalog-content\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.087059 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-utilities\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.087192 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l86bf\" (UniqueName: \"kubernetes.io/projected/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-kube-api-access-l86bf\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.087359 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-catalog-content\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.087577 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-utilities\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.089231 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-catalog-content\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.121812 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86bf\" (UniqueName: 
\"kubernetes.io/projected/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-kube-api-access-l86bf\") pod \"certified-operators-rn4c8\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.277349 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:00 crc kubenswrapper[5200]: W1126 16:06:00.745066 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd0dd41_7e6f_48ce_af9c_f4a6710cefbd.slice/crio-6b068806435d85b6b15dd149ad58d26b04f3e122dc9936680c85ae21e255b971 WatchSource:0}: Error finding container 6b068806435d85b6b15dd149ad58d26b04f3e122dc9936680c85ae21e255b971: Status 404 returned error can't find the container with id 6b068806435d85b6b15dd149ad58d26b04f3e122dc9936680c85ae21e255b971 Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.746396 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rn4c8"] Nov 26 16:06:00 crc kubenswrapper[5200]: I1126 16:06:00.933285 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn4c8" event={"ID":"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd","Type":"ContainerStarted","Data":"6b068806435d85b6b15dd149ad58d26b04f3e122dc9936680c85ae21e255b971"} Nov 26 16:06:01 crc kubenswrapper[5200]: I1126 16:06:01.942756 5200 generic.go:358] "Generic (PLEG): container finished" podID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerID="881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90" exitCode=0 Nov 26 16:06:01 crc kubenswrapper[5200]: I1126 16:06:01.942852 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn4c8" event={"ID":"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd","Type":"ContainerDied","Data":"881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90"} Nov 26 16:06:03 crc kubenswrapper[5200]: I1126 16:06:03.971635 5200 generic.go:358] "Generic (PLEG): container finished" podID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerID="e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8" exitCode=0 Nov 26 16:06:03 crc kubenswrapper[5200]: I1126 16:06:03.971679 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn4c8" event={"ID":"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd","Type":"ContainerDied","Data":"e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8"} Nov 26 16:06:05 crc kubenswrapper[5200]: I1126 16:06:05.005025 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn4c8" event={"ID":"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd","Type":"ContainerStarted","Data":"25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3"} Nov 26 16:06:05 crc kubenswrapper[5200]: I1126 16:06:05.028241 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rn4c8" podStartSLOduration=5.06096781 podStartE2EDuration="6.028210708s" podCreationTimestamp="2025-11-26 16:05:59 +0000 UTC" firstStartedPulling="2025-11-26 16:06:01.944758361 +0000 UTC m=+9330.623960209" lastFinishedPulling="2025-11-26 16:06:02.912001289 +0000 UTC m=+9331.591203107" observedRunningTime="2025-11-26 16:06:05.026649306 +0000 UTC m=+9333.705851154" watchObservedRunningTime="2025-11-26 16:06:05.028210708 +0000 UTC 
m=+9333.707412536" Nov 26 16:06:10 crc kubenswrapper[5200]: I1126 16:06:10.278551 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:10 crc kubenswrapper[5200]: I1126 16:06:10.279075 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:10 crc kubenswrapper[5200]: I1126 16:06:10.333841 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:10 crc kubenswrapper[5200]: I1126 16:06:10.971380 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:06:10 crc kubenswrapper[5200]: E1126 16:06:10.972254 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:06:11 crc kubenswrapper[5200]: I1126 16:06:11.143496 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:11 crc kubenswrapper[5200]: I1126 16:06:11.215515 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rn4c8"] Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.102344 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rn4c8" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="registry-server" containerID="cri-o://25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3" gracePeriod=2 Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.700475 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.786774 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-utilities\") pod \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.786844 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-catalog-content\") pod \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.787027 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l86bf\" (UniqueName: \"kubernetes.io/projected/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-kube-api-access-l86bf\") pod \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\" (UID: \"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd\") " Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.788281 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-utilities" (OuterVolumeSpecName: "utilities") pod "fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" (UID: "fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.814073 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-kube-api-access-l86bf" (OuterVolumeSpecName: "kube-api-access-l86bf") pod "fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" (UID: "fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd"). InnerVolumeSpecName "kube-api-access-l86bf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.822228 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" (UID: "fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.890518 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l86bf\" (UniqueName: \"kubernetes.io/projected/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-kube-api-access-l86bf\") on node \"crc\" DevicePath \"\"" Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.890588 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:06:13 crc kubenswrapper[5200]: I1126 16:06:13.890608 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.123157 5200 generic.go:358] "Generic (PLEG): container finished" podID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerID="25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3" exitCode=0 Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.123276 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rn4c8" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.125151 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn4c8" event={"ID":"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd","Type":"ContainerDied","Data":"25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3"} Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.125335 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rn4c8" event={"ID":"fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd","Type":"ContainerDied","Data":"6b068806435d85b6b15dd149ad58d26b04f3e122dc9936680c85ae21e255b971"} Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.125454 5200 scope.go:117] "RemoveContainer" containerID="25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.161028 5200 scope.go:117] "RemoveContainer" containerID="e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.203421 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rn4c8"] Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.212897 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rn4c8"] Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.223949 5200 scope.go:117] "RemoveContainer" containerID="881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.285135 5200 scope.go:117] "RemoveContainer" containerID="25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3" Nov 26 16:06:14 crc kubenswrapper[5200]: E1126 16:06:14.286046 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3\": container with ID starting with 25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3 not found: ID does not exist" containerID="25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.286088 
5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3"} err="failed to get container status \"25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3\": rpc error: code = NotFound desc = could not find container \"25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3\": container with ID starting with 25a9b48ba95e053768035af8d019fd39055e9c730a2ba8076d728e2a336985f3 not found: ID does not exist" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.286113 5200 scope.go:117] "RemoveContainer" containerID="e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8" Nov 26 16:06:14 crc kubenswrapper[5200]: E1126 16:06:14.287808 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8\": container with ID starting with e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8 not found: ID does not exist" containerID="e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.287840 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8"} err="failed to get container status \"e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8\": rpc error: code = NotFound desc = could not find container \"e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8\": container with ID starting with e92aabaedd86fee6d51b0f5c16c2c132979090e062f6107ce1836970bab907d8 not found: ID does not exist" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.287859 5200 scope.go:117] "RemoveContainer" containerID="881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90" Nov 26 16:06:14 crc kubenswrapper[5200]: E1126 16:06:14.288338 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90\": container with ID starting with 881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90 not found: ID does not exist" containerID="881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.288369 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90"} err="failed to get container status \"881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90\": rpc error: code = NotFound desc = could not find container \"881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90\": container with ID starting with 881c7a366568240ee0d622c333c3e933437d2010cfdc2bf1fdc97b816f7bec90 not found: ID does not exist" Nov 26 16:06:14 crc kubenswrapper[5200]: I1126 16:06:14.991059 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" path="/var/lib/kubelet/pods/fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd/volumes" Nov 26 16:06:24 crc kubenswrapper[5200]: I1126 16:06:24.970609 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:06:24 crc kubenswrapper[5200]: E1126 16:06:24.974106 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:06:36 crc kubenswrapper[5200]: I1126 16:06:36.970518 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:06:36 crc kubenswrapper[5200]: E1126 16:06:36.971435 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:06:42 crc kubenswrapper[5200]: E1126 16:06:42.891331 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442030 actualBytes=10240 Nov 26 16:06:50 crc kubenswrapper[5200]: I1126 16:06:50.970743 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:06:50 crc kubenswrapper[5200]: E1126 16:06:50.971528 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:07:01 crc kubenswrapper[5200]: I1126 16:07:01.970421 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:07:01 crc kubenswrapper[5200]: E1126 16:07:01.971967 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:07:15 crc kubenswrapper[5200]: I1126 16:07:15.970220 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:07:15 crc kubenswrapper[5200]: E1126 16:07:15.974565 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:07:30 crc kubenswrapper[5200]: I1126 16:07:30.969992 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:07:30 crc kubenswrapper[5200]: E1126 16:07:30.971421 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:07:41 crc kubenswrapper[5200]: I1126 16:07:41.969925 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:07:41 crc kubenswrapper[5200]: E1126 16:07:41.971061 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:07:42 crc kubenswrapper[5200]: E1126 16:07:42.788875 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442028 actualBytes=10240 Nov 26 16:07:53 crc kubenswrapper[5200]: I1126 16:07:53.969245 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:07:53 crc kubenswrapper[5200]: E1126 16:07:53.970218 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:08:08 crc kubenswrapper[5200]: I1126 16:08:08.970047 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:08:08 crc kubenswrapper[5200]: E1126 16:08:08.971060 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:08:23 crc kubenswrapper[5200]: I1126 16:08:23.970033 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:08:23 crc kubenswrapper[5200]: E1126 16:08:23.970971 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:08:37 crc kubenswrapper[5200]: I1126 16:08:37.969666 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:08:37 crc kubenswrapper[5200]: E1126 16:08:37.970473 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:08:42 crc kubenswrapper[5200]: E1126 16:08:42.716797 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2442025 actualBytes=10240 Nov 26 16:08:50 crc kubenswrapper[5200]: I1126 16:08:50.970099 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:08:50 crc kubenswrapper[5200]: E1126 16:08:50.972738 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:09:05 crc kubenswrapper[5200]: I1126 16:09:05.970305 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:09:05 crc kubenswrapper[5200]: I1126 16:09:05.973066 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 16:09:07 crc kubenswrapper[5200]: I1126 16:09:06.999525 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"1b7c4bf53fa7c03168795e76121507c4f8b74156fda0d302b44e19ecc62c9c75"} Nov 26 16:09:42 crc kubenswrapper[5200]: E1126 16:09:42.773972 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443351 actualBytes=10240 Nov 26 16:10:36 crc kubenswrapper[5200]: I1126 16:10:36.220355 5200 scope.go:117] "RemoveContainer" containerID="ee4ab94fb06056bcb0884088822d344316aa30b99a14f24d657eba878c650a88" Nov 26 16:10:36 crc kubenswrapper[5200]: I1126 16:10:36.259126 5200 scope.go:117] "RemoveContainer" containerID="40aae5f486fbd2449b04c2444de0d6a2e59c41bf179497c394e47eda9df1ad56" Nov 26 16:10:36 crc kubenswrapper[5200]: I1126 16:10:36.301219 5200 scope.go:117] "RemoveContainer" containerID="e19f71eae3ae2b484da91be69325490e02b3bc2824a9381f0b9d3368122589d2" Nov 26 16:10:39 crc kubenswrapper[5200]: I1126 16:10:39.677840 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:10:39 crc kubenswrapper[5200]: I1126 16:10:39.679506 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:10:39 crc kubenswrapper[5200]: I1126 16:10:39.684645 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:10:39 crc kubenswrapper[5200]: I1126 16:10:39.684668 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:10:42 crc kubenswrapper[5200]: E1126 16:10:42.563048 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443344 actualBytes=10240 Nov 26 16:11:33 crc kubenswrapper[5200]: I1126 16:11:33.154886 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:11:33 crc kubenswrapper[5200]: I1126 16:11:33.155492 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:11:43 crc kubenswrapper[5200]: E1126 16:11:43.013976 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2443333 actualBytes=10240 Nov 26 16:12:03 crc kubenswrapper[5200]: I1126 16:12:03.154945 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:12:03 crc kubenswrapper[5200]: I1126 16:12:03.163508 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:12:30 crc kubenswrapper[5200]: I1126 16:12:30.270428 5200 generic.go:358] "Generic (PLEG): container finished" podID="4308e63d-232d-4d73-a202-5bb6000bac9c" containerID="3d3f56c64c5ea5bb906e7048add3766766d6fb40bc67dee906e10aecba52326e" exitCode=0 Nov 26 16:12:30 crc kubenswrapper[5200]: I1126 16:12:30.270517 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" event={"ID":"4308e63d-232d-4d73-a202-5bb6000bac9c","Type":"ContainerDied","Data":"3d3f56c64c5ea5bb906e7048add3766766d6fb40bc67dee906e10aecba52326e"} Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.949756 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.979417 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-0\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.980521 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-0\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.980606 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-1\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.980798 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ssh-key\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.981148 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ceph\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.981184 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-0\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.981225 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7zvq\" (UniqueName: \"kubernetes.io/projected/4308e63d-232d-4d73-a202-5bb6000bac9c-kube-api-access-r7zvq\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.981829 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-inventory\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.981973 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-1\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.982051 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-1\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.982171 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-combined-ca-bundle\") pod \"4308e63d-232d-4d73-a202-5bb6000bac9c\" (UID: \"4308e63d-232d-4d73-a202-5bb6000bac9c\") " Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.987571 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.990919 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ceph" (OuterVolumeSpecName: "ceph") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:31 crc kubenswrapper[5200]: I1126 16:12:31.994654 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4308e63d-232d-4d73-a202-5bb6000bac9c-kube-api-access-r7zvq" (OuterVolumeSpecName: "kube-api-access-r7zvq") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "kube-api-access-r7zvq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.024125 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.042278 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.044329 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.053280 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.057905 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.064594 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.077435 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-inventory" (OuterVolumeSpecName: "inventory") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085755 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085779 5200 reconciler_common.go:299] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085790 5200 reconciler_common.go:299] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-ceph\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085798 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085808 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7zvq\" (UniqueName: \"kubernetes.io/projected/4308e63d-232d-4d73-a202-5bb6000bac9c-kube-api-access-r7zvq\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085816 5200 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-inventory\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085824 5200 reconciler_common.go:299] "Volume 
detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085832 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085841 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.085850 5200 reconciler_common.go:299] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.087220 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "4308e63d-232d-4d73-a202-5bb6000bac9c" (UID: "4308e63d-232d-4d73-a202-5bb6000bac9c"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.188173 5200 reconciler_common.go:299] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/4308e63d-232d-4d73-a202-5bb6000bac9c-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.293719 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" event={"ID":"4308e63d-232d-4d73-a202-5bb6000bac9c","Type":"ContainerDied","Data":"c8bd30e500ec172fe484e04618ecd32f8ce9324dbfcb2c9b1cdd1404cd8d79be"} Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.293902 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n" Nov 26 16:12:32 crc kubenswrapper[5200]: I1126 16:12:32.294693 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8bd30e500ec172fe484e04618ecd32f8ce9324dbfcb2c9b1cdd1404cd8d79be" Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.154151 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.154220 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.154265 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.154929 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b7c4bf53fa7c03168795e76121507c4f8b74156fda0d302b44e19ecc62c9c75"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.154991 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://1b7c4bf53fa7c03168795e76121507c4f8b74156fda0d302b44e19ecc62c9c75" gracePeriod=600 Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.319375 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="1b7c4bf53fa7c03168795e76121507c4f8b74156fda0d302b44e19ecc62c9c75" exitCode=0 Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.319569 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"1b7c4bf53fa7c03168795e76121507c4f8b74156fda0d302b44e19ecc62c9c75"} Nov 26 16:12:33 crc kubenswrapper[5200]: I1126 16:12:33.319824 5200 scope.go:117] "RemoveContainer" containerID="16c2ffcfc4ac3637ef566aa6d4b30778f65db3cf1c153337500f10facdac5c73" Nov 26 16:12:34 crc kubenswrapper[5200]: I1126 16:12:34.334300 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea"} Nov 26 16:12:42 crc kubenswrapper[5200]: E1126 16:12:42.780391 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2433712 actualBytes=10240 Nov 26 16:13:42 crc kubenswrapper[5200]: E1126 16:13:42.974698 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" 
expectedBytes=2433696 actualBytes=10240 Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.061624 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ck5jp"] Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.064875 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="extract-content" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.064904 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="extract-content" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.064978 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="extract-utilities" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.064992 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="extract-utilities" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.065020 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4308e63d-232d-4d73-a202-5bb6000bac9c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.065034 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="4308e63d-232d-4d73-a202-5bb6000bac9c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.065117 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="registry-server" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.065127 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="registry-server" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.065555 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="4308e63d-232d-4d73-a202-5bb6000bac9c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.065594 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcd0dd41-7e6f-48ce-af9c-f4a6710cefbd" containerName="registry-server" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.074650 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.090485 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ck5jp"] Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.168048 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-utilities\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.168114 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-catalog-content\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.168333 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fttg\" (UniqueName: \"kubernetes.io/projected/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-kube-api-access-9fttg\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.270408 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-utilities\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.270465 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-catalog-content\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.270637 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fttg\" (UniqueName: \"kubernetes.io/projected/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-kube-api-access-9fttg\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.271044 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-utilities\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.271333 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-catalog-content\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.299470 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9fttg\" (UniqueName: \"kubernetes.io/projected/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-kube-api-access-9fttg\") pod \"redhat-operators-ck5jp\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.420823 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:48 crc kubenswrapper[5200]: I1126 16:13:48.908713 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ck5jp"] Nov 26 16:13:49 crc kubenswrapper[5200]: I1126 16:13:49.292211 5200 generic.go:358] "Generic (PLEG): container finished" podID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerID="150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9" exitCode=0 Nov 26 16:13:49 crc kubenswrapper[5200]: I1126 16:13:49.292293 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerDied","Data":"150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9"} Nov 26 16:13:49 crc kubenswrapper[5200]: I1126 16:13:49.292628 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerStarted","Data":"6c5c7f427fd9d8d93df087b471c89cec06fa423feee72b870c5fb2e0ddd17591"} Nov 26 16:13:52 crc kubenswrapper[5200]: I1126 16:13:52.340365 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerStarted","Data":"72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c"} Nov 26 16:13:55 crc kubenswrapper[5200]: I1126 16:13:55.378919 5200 generic.go:358] "Generic (PLEG): container finished" podID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerID="72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c" exitCode=0 Nov 26 16:13:55 crc kubenswrapper[5200]: I1126 16:13:55.378991 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerDied","Data":"72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c"} Nov 26 16:13:56 crc kubenswrapper[5200]: I1126 16:13:56.399928 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerStarted","Data":"1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1"} Nov 26 16:13:56 crc kubenswrapper[5200]: I1126 16:13:56.433475 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ck5jp" podStartSLOduration=6.356442638 podStartE2EDuration="8.43345598s" podCreationTimestamp="2025-11-26 16:13:48 +0000 UTC" firstStartedPulling="2025-11-26 16:13:49.293548009 +0000 UTC m=+9797.972749827" lastFinishedPulling="2025-11-26 16:13:51.370561341 +0000 UTC m=+9800.049763169" observedRunningTime="2025-11-26 16:13:56.427658365 +0000 UTC m=+9805.106860183" watchObservedRunningTime="2025-11-26 16:13:56.43345598 +0000 UTC m=+9805.112657798" Nov 26 16:13:58 crc kubenswrapper[5200]: I1126 16:13:58.421075 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-ck5jp" 
Nov 26 16:13:58 crc kubenswrapper[5200]: I1126 16:13:58.421926 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:13:59 crc kubenswrapper[5200]: I1126 16:13:59.484245 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ck5jp" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="registry-server" probeResult="failure" output=< Nov 26 16:13:59 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 16:13:59 crc kubenswrapper[5200]: > Nov 26 16:14:08 crc kubenswrapper[5200]: I1126 16:14:08.485930 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:14:08 crc kubenswrapper[5200]: I1126 16:14:08.555869 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:14:08 crc kubenswrapper[5200]: I1126 16:14:08.746871 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ck5jp"] Nov 26 16:14:09 crc kubenswrapper[5200]: I1126 16:14:09.554583 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ck5jp" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="registry-server" containerID="cri-o://1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1" gracePeriod=2 Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.135576 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.242571 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-utilities\") pod \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.242876 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fttg\" (UniqueName: \"kubernetes.io/projected/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-kube-api-access-9fttg\") pod \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.243422 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-catalog-content\") pod \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\" (UID: \"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8\") " Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.245266 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-utilities" (OuterVolumeSpecName: "utilities") pod "ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" (UID: "ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.258501 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-kube-api-access-9fttg" (OuterVolumeSpecName: "kube-api-access-9fttg") pod "ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" (UID: "ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8"). InnerVolumeSpecName "kube-api-access-9fttg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.333092 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" (UID: "ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.347189 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.347259 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.347271 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9fttg\" (UniqueName: \"kubernetes.io/projected/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8-kube-api-access-9fttg\") on node \"crc\" DevicePath \"\"" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.565279 5200 generic.go:358] "Generic (PLEG): container finished" podID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerID="1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1" exitCode=0 Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.565673 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerDied","Data":"1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1"} Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.565729 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ck5jp" event={"ID":"ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8","Type":"ContainerDied","Data":"6c5c7f427fd9d8d93df087b471c89cec06fa423feee72b870c5fb2e0ddd17591"} Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.565745 5200 scope.go:117] "RemoveContainer" containerID="1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.565885 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ck5jp" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.612429 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ck5jp"] Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.630151 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ck5jp"] Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.639198 5200 scope.go:117] "RemoveContainer" containerID="72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.669619 5200 scope.go:117] "RemoveContainer" containerID="150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.729385 5200 scope.go:117] "RemoveContainer" containerID="1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1" Nov 26 16:14:10 crc kubenswrapper[5200]: E1126 16:14:10.730142 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1\": container with ID starting with 1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1 not found: ID does not exist" containerID="1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.730175 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1"} err="failed to get container status \"1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1\": rpc error: code = NotFound desc = could not find container \"1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1\": container with ID starting with 1cbcfaf9e71e9a87342f7f71753fe04a855ff74fd222c77f55907ab2ebf934b1 not found: ID does not exist" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.730193 5200 scope.go:117] "RemoveContainer" containerID="72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c" Nov 26 16:14:10 crc kubenswrapper[5200]: E1126 16:14:10.730812 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c\": container with ID starting with 72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c not found: ID does not exist" containerID="72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.730834 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c"} err="failed to get container status \"72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c\": rpc error: code = NotFound desc = could not find container \"72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c\": container with ID starting with 72e3cb000bc6240e8d452a860f9f0971b8a962dfbd8d02234f79ed687cff178c not found: ID does not exist" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.730849 5200 scope.go:117] "RemoveContainer" containerID="150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9" Nov 26 16:14:10 crc kubenswrapper[5200]: E1126 16:14:10.731116 5200 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9\": container with ID starting with 150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9 not found: ID does not exist" containerID="150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.731134 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9"} err="failed to get container status \"150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9\": rpc error: code = NotFound desc = could not find container \"150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9\": container with ID starting with 150c9d814f37a138bc4f624cb3a0b9762f99a54d7de5181ab8aefd31110ed9a9 not found: ID does not exist" Nov 26 16:14:10 crc kubenswrapper[5200]: I1126 16:14:10.987167 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" path="/var/lib/kubelet/pods/ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8/volumes" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.749572 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8clw"] Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.752871 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="registry-server" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.752910 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="registry-server" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.753019 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="extract-utilities" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.753039 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="extract-utilities" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.753096 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="extract-content" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.753111 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="extract-content" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.753658 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb3eea9-d0fa-423b-95e2-b22ef2e2f9a8" containerName="registry-server" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.772331 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8clw"] Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.772476 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.870134 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-catalog-content\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.870992 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqdl\" (UniqueName: \"kubernetes.io/projected/671094b8-f935-47f2-9a4b-0428479c9f21-kube-api-access-8wqdl\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.871098 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-utilities\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.972894 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqdl\" (UniqueName: \"kubernetes.io/projected/671094b8-f935-47f2-9a4b-0428479c9f21-kube-api-access-8wqdl\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.972970 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-utilities\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.973078 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-catalog-content\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.973540 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-utilities\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:14 crc kubenswrapper[5200]: I1126 16:14:14.973648 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-catalog-content\") pod \"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:15 crc kubenswrapper[5200]: I1126 16:14:15.008195 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqdl\" (UniqueName: \"kubernetes.io/projected/671094b8-f935-47f2-9a4b-0428479c9f21-kube-api-access-8wqdl\") pod 
\"community-operators-f8clw\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:15 crc kubenswrapper[5200]: I1126 16:14:15.126443 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:15 crc kubenswrapper[5200]: I1126 16:14:15.697657 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8clw"] Nov 26 16:14:15 crc kubenswrapper[5200]: W1126 16:14:15.706811 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671094b8_f935_47f2_9a4b_0428479c9f21.slice/crio-75bcfa5642a9dc5f52ba28570fa091e6bf7beb81dc22a87f65949745ccd3c60d WatchSource:0}: Error finding container 75bcfa5642a9dc5f52ba28570fa091e6bf7beb81dc22a87f65949745ccd3c60d: Status 404 returned error can't find the container with id 75bcfa5642a9dc5f52ba28570fa091e6bf7beb81dc22a87f65949745ccd3c60d Nov 26 16:14:15 crc kubenswrapper[5200]: I1126 16:14:15.710504 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 16:14:16 crc kubenswrapper[5200]: I1126 16:14:16.642965 5200 generic.go:358] "Generic (PLEG): container finished" podID="671094b8-f935-47f2-9a4b-0428479c9f21" containerID="7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25" exitCode=0 Nov 26 16:14:16 crc kubenswrapper[5200]: I1126 16:14:16.643033 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8clw" event={"ID":"671094b8-f935-47f2-9a4b-0428479c9f21","Type":"ContainerDied","Data":"7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25"} Nov 26 16:14:16 crc kubenswrapper[5200]: I1126 16:14:16.643577 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8clw" event={"ID":"671094b8-f935-47f2-9a4b-0428479c9f21","Type":"ContainerStarted","Data":"75bcfa5642a9dc5f52ba28570fa091e6bf7beb81dc22a87f65949745ccd3c60d"} Nov 26 16:14:18 crc kubenswrapper[5200]: I1126 16:14:18.669422 5200 generic.go:358] "Generic (PLEG): container finished" podID="671094b8-f935-47f2-9a4b-0428479c9f21" containerID="9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22" exitCode=0 Nov 26 16:14:18 crc kubenswrapper[5200]: I1126 16:14:18.669545 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8clw" event={"ID":"671094b8-f935-47f2-9a4b-0428479c9f21","Type":"ContainerDied","Data":"9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22"} Nov 26 16:14:19 crc kubenswrapper[5200]: I1126 16:14:19.685546 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8clw" event={"ID":"671094b8-f935-47f2-9a4b-0428479c9f21","Type":"ContainerStarted","Data":"db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d"} Nov 26 16:14:19 crc kubenswrapper[5200]: I1126 16:14:19.713974 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8clw" podStartSLOduration=4.8464577250000005 podStartE2EDuration="5.713957355s" podCreationTimestamp="2025-11-26 16:14:14 +0000 UTC" firstStartedPulling="2025-11-26 16:14:16.644587632 +0000 UTC m=+9825.323789450" lastFinishedPulling="2025-11-26 16:14:17.512087262 +0000 UTC m=+9826.191289080" observedRunningTime="2025-11-26 16:14:19.704154363 +0000 UTC 
m=+9828.383356201" watchObservedRunningTime="2025-11-26 16:14:19.713957355 +0000 UTC m=+9828.393159173" Nov 26 16:14:25 crc kubenswrapper[5200]: I1126 16:14:25.127485 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:25 crc kubenswrapper[5200]: I1126 16:14:25.128358 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:25 crc kubenswrapper[5200]: I1126 16:14:25.199599 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:25 crc kubenswrapper[5200]: I1126 16:14:25.855824 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:30 crc kubenswrapper[5200]: I1126 16:14:30.953169 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8clw"] Nov 26 16:14:30 crc kubenswrapper[5200]: I1126 16:14:30.954714 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8clw" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="registry-server" containerID="cri-o://db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d" gracePeriod=2 Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.522318 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.597535 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-catalog-content\") pod \"671094b8-f935-47f2-9a4b-0428479c9f21\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.597946 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqdl\" (UniqueName: \"kubernetes.io/projected/671094b8-f935-47f2-9a4b-0428479c9f21-kube-api-access-8wqdl\") pod \"671094b8-f935-47f2-9a4b-0428479c9f21\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.598078 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-utilities\") pod \"671094b8-f935-47f2-9a4b-0428479c9f21\" (UID: \"671094b8-f935-47f2-9a4b-0428479c9f21\") " Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.600182 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-utilities" (OuterVolumeSpecName: "utilities") pod "671094b8-f935-47f2-9a4b-0428479c9f21" (UID: "671094b8-f935-47f2-9a4b-0428479c9f21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.610934 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671094b8-f935-47f2-9a4b-0428479c9f21-kube-api-access-8wqdl" (OuterVolumeSpecName: "kube-api-access-8wqdl") pod "671094b8-f935-47f2-9a4b-0428479c9f21" (UID: "671094b8-f935-47f2-9a4b-0428479c9f21"). InnerVolumeSpecName "kube-api-access-8wqdl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.663969 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "671094b8-f935-47f2-9a4b-0428479c9f21" (UID: "671094b8-f935-47f2-9a4b-0428479c9f21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.701043 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8wqdl\" (UniqueName: \"kubernetes.io/projected/671094b8-f935-47f2-9a4b-0428479c9f21-kube-api-access-8wqdl\") on node \"crc\" DevicePath \"\"" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.701325 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.701421 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671094b8-f935-47f2-9a4b-0428479c9f21-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.887526 5200 generic.go:358] "Generic (PLEG): container finished" podID="671094b8-f935-47f2-9a4b-0428479c9f21" containerID="db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d" exitCode=0 Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.888084 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8clw" event={"ID":"671094b8-f935-47f2-9a4b-0428479c9f21","Type":"ContainerDied","Data":"db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d"} Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.888124 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8clw" event={"ID":"671094b8-f935-47f2-9a4b-0428479c9f21","Type":"ContainerDied","Data":"75bcfa5642a9dc5f52ba28570fa091e6bf7beb81dc22a87f65949745ccd3c60d"} Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.888145 5200 scope.go:117] "RemoveContainer" containerID="db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.888355 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8clw" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.949712 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8clw"] Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.951803 5200 scope.go:117] "RemoveContainer" containerID="9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22" Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.958918 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8clw"] Nov 26 16:14:31 crc kubenswrapper[5200]: I1126 16:14:31.987897 5200 scope.go:117] "RemoveContainer" containerID="7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.058101 5200 scope.go:117] "RemoveContainer" containerID="db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d" Nov 26 16:14:32 crc kubenswrapper[5200]: E1126 16:14:32.058839 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d\": container with ID starting with db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d not found: ID does not exist" containerID="db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.058884 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d"} err="failed to get container status \"db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d\": rpc error: code = NotFound desc = could not find container \"db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d\": container with ID starting with db902a5a9819d5a233a1c3d1d0a8cf3d57c5a7ca9b13091b3c9e9a0b163d953d not found: ID does not exist" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.058909 5200 scope.go:117] "RemoveContainer" containerID="9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22" Nov 26 16:14:32 crc kubenswrapper[5200]: E1126 16:14:32.059490 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22\": container with ID starting with 9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22 not found: ID does not exist" containerID="9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.059522 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22"} err="failed to get container status \"9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22\": rpc error: code = NotFound desc = could not find container \"9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22\": container with ID starting with 9366295a0b459bd1fa7d5316791269fb16eea4b0d40db4379231d2b2ec324e22 not found: ID does not exist" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.059549 5200 scope.go:117] "RemoveContainer" containerID="7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25" Nov 26 16:14:32 crc kubenswrapper[5200]: E1126 16:14:32.059981 5200 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25\": container with ID starting with 7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25 not found: ID does not exist" containerID="7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.059999 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25"} err="failed to get container status \"7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25\": rpc error: code = NotFound desc = could not find container \"7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25\": container with ID starting with 7d563a3c9d5327ffc9725c03f5093bae6acef6ec5a7dc7fee0499562d9d9ae25 not found: ID does not exist" Nov 26 16:14:32 crc kubenswrapper[5200]: I1126 16:14:32.989199 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" path="/var/lib/kubelet/pods/671094b8-f935-47f2-9a4b-0428479c9f21/volumes" Nov 26 16:14:33 crc kubenswrapper[5200]: I1126 16:14:33.154083 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:14:33 crc kubenswrapper[5200]: I1126 16:14:33.154596 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:14:42 crc kubenswrapper[5200]: E1126 16:14:42.942946 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2433683 actualBytes=10240 Nov 26 16:14:45 crc kubenswrapper[5200]: I1126 16:14:45.033739 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Nov 26 16:14:45 crc kubenswrapper[5200]: I1126 16:14:45.035103 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="c13a2e17-c871-4417-80b1-e1c60cf6d64c" containerName="adoption" containerID="cri-o://0dac33125f4702659e50f0807691fe4181adc25b9fe47c074981155bbe9e189d" gracePeriod=30 Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.169547 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9"] Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.171728 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="extract-content" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.171747 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="extract-content" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.171754 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="registry-server" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.171762 5200 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="registry-server" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.171810 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="extract-utilities" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.171816 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="extract-utilities" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.172045 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="671094b8-f935-47f2-9a4b-0428479c9f21" containerName="registry-server" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.180512 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.182905 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.183218 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.199513 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9"] Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.276563 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crxnr\" (UniqueName: \"kubernetes.io/projected/af1bdc39-4de6-4910-aca7-421db7fb3403-kube-api-access-crxnr\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.276752 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af1bdc39-4de6-4910-aca7-421db7fb3403-config-volume\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.277100 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af1bdc39-4de6-4910-aca7-421db7fb3403-secret-volume\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.379247 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af1bdc39-4de6-4910-aca7-421db7fb3403-config-volume\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.379503 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af1bdc39-4de6-4910-aca7-421db7fb3403-secret-volume\") 
pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.379853 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crxnr\" (UniqueName: \"kubernetes.io/projected/af1bdc39-4de6-4910-aca7-421db7fb3403-kube-api-access-crxnr\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.380106 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af1bdc39-4de6-4910-aca7-421db7fb3403-config-volume\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.387295 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af1bdc39-4de6-4910-aca7-421db7fb3403-secret-volume\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.399334 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crxnr\" (UniqueName: \"kubernetes.io/projected/af1bdc39-4de6-4910-aca7-421db7fb3403-kube-api-access-crxnr\") pod \"collect-profiles-29402895-hrbp9\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:00 crc kubenswrapper[5200]: I1126 16:15:00.508671 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:01 crc kubenswrapper[5200]: I1126 16:15:01.017394 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9"] Nov 26 16:15:01 crc kubenswrapper[5200]: I1126 16:15:01.260016 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" event={"ID":"af1bdc39-4de6-4910-aca7-421db7fb3403","Type":"ContainerStarted","Data":"53ff92d748b4ea39c28ad53ea526ea7a901514b710c398b7250c606f6d19414e"} Nov 26 16:15:01 crc kubenswrapper[5200]: I1126 16:15:01.261294 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" event={"ID":"af1bdc39-4de6-4910-aca7-421db7fb3403","Type":"ContainerStarted","Data":"15bd5fc57c9737a6c557953a514c55b7e30fd17193452dc9966f40eb80353d3c"} Nov 26 16:15:01 crc kubenswrapper[5200]: I1126 16:15:01.279407 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" podStartSLOduration=1.2793893729999999 podStartE2EDuration="1.279389373s" podCreationTimestamp="2025-11-26 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:15:01.275024767 +0000 UTC m=+9869.954226595" watchObservedRunningTime="2025-11-26 16:15:01.279389373 +0000 UTC m=+9869.958591181" Nov 26 16:15:02 crc kubenswrapper[5200]: I1126 16:15:02.273214 5200 generic.go:358] "Generic (PLEG): container finished" podID="af1bdc39-4de6-4910-aca7-421db7fb3403" containerID="53ff92d748b4ea39c28ad53ea526ea7a901514b710c398b7250c606f6d19414e" exitCode=0 Nov 26 16:15:02 crc kubenswrapper[5200]: I1126 16:15:02.273325 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" event={"ID":"af1bdc39-4de6-4910-aca7-421db7fb3403","Type":"ContainerDied","Data":"53ff92d748b4ea39c28ad53ea526ea7a901514b710c398b7250c606f6d19414e"} Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.154236 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.154585 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.681472 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.869371 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crxnr\" (UniqueName: \"kubernetes.io/projected/af1bdc39-4de6-4910-aca7-421db7fb3403-kube-api-access-crxnr\") pod \"af1bdc39-4de6-4910-aca7-421db7fb3403\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.869518 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af1bdc39-4de6-4910-aca7-421db7fb3403-secret-volume\") pod \"af1bdc39-4de6-4910-aca7-421db7fb3403\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.869716 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af1bdc39-4de6-4910-aca7-421db7fb3403-config-volume\") pod \"af1bdc39-4de6-4910-aca7-421db7fb3403\" (UID: \"af1bdc39-4de6-4910-aca7-421db7fb3403\") " Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.870757 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1bdc39-4de6-4910-aca7-421db7fb3403-config-volume" (OuterVolumeSpecName: "config-volume") pod "af1bdc39-4de6-4910-aca7-421db7fb3403" (UID: "af1bdc39-4de6-4910-aca7-421db7fb3403"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.871729 5200 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af1bdc39-4de6-4910-aca7-421db7fb3403-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.880263 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1bdc39-4de6-4910-aca7-421db7fb3403-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af1bdc39-4de6-4910-aca7-421db7fb3403" (UID: "af1bdc39-4de6-4910-aca7-421db7fb3403"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.880537 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1bdc39-4de6-4910-aca7-421db7fb3403-kube-api-access-crxnr" (OuterVolumeSpecName: "kube-api-access-crxnr") pod "af1bdc39-4de6-4910-aca7-421db7fb3403" (UID: "af1bdc39-4de6-4910-aca7-421db7fb3403"). InnerVolumeSpecName "kube-api-access-crxnr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.975110 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crxnr\" (UniqueName: \"kubernetes.io/projected/af1bdc39-4de6-4910-aca7-421db7fb3403-kube-api-access-crxnr\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:03 crc kubenswrapper[5200]: I1126 16:15:03.975725 5200 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af1bdc39-4de6-4910-aca7-421db7fb3403-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:04 crc kubenswrapper[5200]: I1126 16:15:04.302514 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" Nov 26 16:15:04 crc kubenswrapper[5200]: I1126 16:15:04.303482 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402895-hrbp9" event={"ID":"af1bdc39-4de6-4910-aca7-421db7fb3403","Type":"ContainerDied","Data":"15bd5fc57c9737a6c557953a514c55b7e30fd17193452dc9966f40eb80353d3c"} Nov 26 16:15:04 crc kubenswrapper[5200]: I1126 16:15:04.303545 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15bd5fc57c9737a6c557953a514c55b7e30fd17193452dc9966f40eb80353d3c" Nov 26 16:15:04 crc kubenswrapper[5200]: I1126 16:15:04.372046 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m"] Nov 26 16:15:04 crc kubenswrapper[5200]: I1126 16:15:04.384344 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-m2p8m"] Nov 26 16:15:04 crc kubenswrapper[5200]: I1126 16:15:04.991383 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cce5b96-9f16-45b4-be33-b50774e13c38" path="/var/lib/kubelet/pods/8cce5b96-9f16-45b4-be33-b50774e13c38/volumes" Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.432485 5200 generic.go:358] "Generic (PLEG): container finished" podID="c13a2e17-c871-4417-80b1-e1c60cf6d64c" containerID="0dac33125f4702659e50f0807691fe4181adc25b9fe47c074981155bbe9e189d" exitCode=137 Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.433012 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c13a2e17-c871-4417-80b1-e1c60cf6d64c","Type":"ContainerDied","Data":"0dac33125f4702659e50f0807691fe4181adc25b9fe47c074981155bbe9e189d"} Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.567028 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.605519 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") pod \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.605965 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlnnp\" (UniqueName: \"kubernetes.io/projected/c13a2e17-c871-4417-80b1-e1c60cf6d64c-kube-api-access-jlnnp\") pod \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\" (UID: \"c13a2e17-c871-4417-80b1-e1c60cf6d64c\") " Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.631189 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13a2e17-c871-4417-80b1-e1c60cf6d64c-kube-api-access-jlnnp" (OuterVolumeSpecName: "kube-api-access-jlnnp") pod "c13a2e17-c871-4417-80b1-e1c60cf6d64c" (UID: "c13a2e17-c871-4417-80b1-e1c60cf6d64c"). InnerVolumeSpecName "kube-api-access-jlnnp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.637564 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86" (OuterVolumeSpecName: "mariadb-data") pod "c13a2e17-c871-4417-80b1-e1c60cf6d64c" (UID: "c13a2e17-c871-4417-80b1-e1c60cf6d64c"). InnerVolumeSpecName "pvc-d8b46129-29e3-4a22-9026-942b95d0ca86". PluginName "kubernetes.io/csi", VolumeGIDValue "" Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.709678 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") on node \"crc\" " Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.709745 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlnnp\" (UniqueName: \"kubernetes.io/projected/c13a2e17-c871-4417-80b1-e1c60cf6d64c-kube-api-access-jlnnp\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.748886 5200 csi_attacher.go:567] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.749378 5200 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-d8b46129-29e3-4a22-9026-942b95d0ca86" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86") on node "crc" Nov 26 16:15:15 crc kubenswrapper[5200]: I1126 16:15:15.812395 5200 reconciler_common.go:299] "Volume detached for volume \"pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8b46129-29e3-4a22-9026-942b95d0ca86\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:16 crc kubenswrapper[5200]: I1126 16:15:16.455796 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Nov 26 16:15:16 crc kubenswrapper[5200]: I1126 16:15:16.455815 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c13a2e17-c871-4417-80b1-e1c60cf6d64c","Type":"ContainerDied","Data":"84039a7c1a282518b151f3debcdb075cd8e7ae197d6795c83b4230064aff255b"} Nov 26 16:15:16 crc kubenswrapper[5200]: I1126 16:15:16.455873 5200 scope.go:117] "RemoveContainer" containerID="0dac33125f4702659e50f0807691fe4181adc25b9fe47c074981155bbe9e189d" Nov 26 16:15:16 crc kubenswrapper[5200]: I1126 16:15:16.502325 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Nov 26 16:15:16 crc kubenswrapper[5200]: I1126 16:15:16.512845 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Nov 26 16:15:16 crc kubenswrapper[5200]: I1126 16:15:16.982430 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13a2e17-c871-4417-80b1-e1c60cf6d64c" path="/var/lib/kubelet/pods/c13a2e17-c871-4417-80b1-e1c60cf6d64c/volumes" Nov 26 16:15:17 crc kubenswrapper[5200]: I1126 16:15:17.168138 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Nov 26 16:15:17 crc kubenswrapper[5200]: I1126 16:15:17.168389 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" containerName="adoption" containerID="cri-o://c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86" gracePeriod=30 Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.154925 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.155578 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.155636 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.156621 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.156705 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" gracePeriod=600 Nov 26 16:15:33 crc kubenswrapper[5200]: E1126 16:15:33.324817 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.666150 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" exitCode=0 Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.666528 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea"} Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.666658 5200 scope.go:117] "RemoveContainer" containerID="1b7c4bf53fa7c03168795e76121507c4f8b74156fda0d302b44e19ecc62c9c75" Nov 26 16:15:33 crc kubenswrapper[5200]: I1126 16:15:33.667825 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:15:33 crc kubenswrapper[5200]: E1126 16:15:33.668596 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:15:36 crc kubenswrapper[5200]: I1126 16:15:36.515528 5200 scope.go:117] "RemoveContainer" containerID="7aa4b6f9112e979cb51369f10095e85ab9f6f56c05c02d447d406b9edc51b428" Nov 26 16:15:39 crc kubenswrapper[5200]: I1126 16:15:39.908671 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:15:39 crc kubenswrapper[5200]: I1126 16:15:39.914146 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:15:39 crc kubenswrapper[5200]: I1126 16:15:39.917188 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:15:39 crc kubenswrapper[5200]: I1126 16:15:39.920834 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:15:43 crc kubenswrapper[5200]: E1126 16:15:43.223908 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2343857 actualBytes=10240 Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.789116 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.854481 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-ovn-data-cert\") pod \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.854542 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt9q9\" (UniqueName: \"kubernetes.io/projected/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-kube-api-access-pt9q9\") pod \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.855784 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") pod \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\" (UID: \"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff\") " Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.862253 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" (UID: "9cdf2dbb-d95e-4aae-abcf-9d23cde283ff"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.867139 5200 generic.go:358] "Generic (PLEG): container finished" podID="9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" containerID="c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86" exitCode=137 Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.867474 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff","Type":"ContainerDied","Data":"c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86"} Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.867511 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"9cdf2dbb-d95e-4aae-abcf-9d23cde283ff","Type":"ContainerDied","Data":"619d52b0308db8414497f3cd66903ddc4ef97a0cd9537664b42beb7c3916bfa4"} Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.867531 5200 scope.go:117] "RemoveContainer" containerID="c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.867790 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.869799 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-kube-api-access-pt9q9" (OuterVolumeSpecName: "kube-api-access-pt9q9") pod "9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" (UID: "9cdf2dbb-d95e-4aae-abcf-9d23cde283ff"). InnerVolumeSpecName "kube-api-access-pt9q9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.875952 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178" (OuterVolumeSpecName: "ovn-data") pod "9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" (UID: "9cdf2dbb-d95e-4aae-abcf-9d23cde283ff"). InnerVolumeSpecName "pvc-856a6966-322f-4462-b2fc-01a35ef27178". PluginName "kubernetes.io/csi", VolumeGIDValue "" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.958758 5200 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-856a6966-322f-4462-b2fc-01a35ef27178\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") on node \"crc\" " Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.959001 5200 reconciler_common.go:299] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.959087 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pt9q9\" (UniqueName: \"kubernetes.io/projected/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff-kube-api-access-pt9q9\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.977024 5200 scope.go:117] "RemoveContainer" containerID="c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86" Nov 26 16:15:47 crc kubenswrapper[5200]: E1126 16:15:47.977455 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86\": container with ID starting with c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86 not found: ID does not exist" containerID="c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.977488 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86"} err="failed to get container status \"c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86\": rpc error: code = NotFound desc = could not find container \"c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86\": container with ID starting with c0c3e7990532e7148a2e67d1ce78fac055e92da95db1f8c142d383b382156d86 not found: ID does not exist" Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.984410 5200 csi_attacher.go:567] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 26 16:15:47 crc kubenswrapper[5200]: I1126 16:15:47.984545 5200 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-856a6966-322f-4462-b2fc-01a35ef27178" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178") on node "crc" Nov 26 16:15:48 crc kubenswrapper[5200]: I1126 16:15:48.062110 5200 reconciler_common.go:299] "Volume detached for volume \"pvc-856a6966-322f-4462-b2fc-01a35ef27178\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-856a6966-322f-4462-b2fc-01a35ef27178\") on node \"crc\" DevicePath \"\"" Nov 26 16:15:48 crc kubenswrapper[5200]: I1126 16:15:48.204018 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Nov 26 16:15:48 crc kubenswrapper[5200]: I1126 16:15:48.213319 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Nov 26 16:15:48 crc kubenswrapper[5200]: I1126 16:15:48.970295 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:15:48 crc kubenswrapper[5200]: E1126 16:15:48.971182 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:15:48 crc kubenswrapper[5200]: I1126 16:15:48.979920 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" path="/var/lib/kubelet/pods/9cdf2dbb-d95e-4aae-abcf-9d23cde283ff/volumes" Nov 26 16:15:59 crc kubenswrapper[5200]: I1126 16:15:59.969406 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:15:59 crc kubenswrapper[5200]: E1126 16:15:59.970436 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:16:11 crc kubenswrapper[5200]: I1126 16:16:11.970182 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:16:11 crc kubenswrapper[5200]: E1126 16:16:11.971326 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:16:25 crc kubenswrapper[5200]: I1126 16:16:25.970455 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:16:25 crc kubenswrapper[5200]: E1126 16:16:25.971660 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:16:38 crc kubenswrapper[5200]: I1126 16:16:38.970505 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:16:38 crc kubenswrapper[5200]: E1126 16:16:38.971706 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:16:42 crc kubenswrapper[5200]: E1126 16:16:42.690006 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2336055 actualBytes=10240 Nov 26 16:16:52 crc kubenswrapper[5200]: I1126 16:16:52.969859 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:16:52 crc kubenswrapper[5200]: E1126 16:16:52.971031 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.671720 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pb99x/must-gather-w2ntz"] Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.673698 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c13a2e17-c871-4417-80b1-e1c60cf6d64c" containerName="adoption" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.673718 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13a2e17-c871-4417-80b1-e1c60cf6d64c" containerName="adoption" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.673739 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af1bdc39-4de6-4910-aca7-421db7fb3403" containerName="collect-profiles" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.673745 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1bdc39-4de6-4910-aca7-421db7fb3403" containerName="collect-profiles" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.673772 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" containerName="adoption" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.673780 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" containerName="adoption" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.674106 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="af1bdc39-4de6-4910-aca7-421db7fb3403" containerName="collect-profiles" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.674125 5200 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c13a2e17-c871-4417-80b1-e1c60cf6d64c" containerName="adoption" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.674143 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cdf2dbb-d95e-4aae-abcf-9d23cde283ff" containerName="adoption" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.681098 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.684026 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pb99x\"/\"openshift-service-ca.crt\"" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.684282 5200 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pb99x\"/\"kube-root-ca.crt\"" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.716305 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pb99x/must-gather-w2ntz"] Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.791924 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv4vs\" (UniqueName: \"kubernetes.io/projected/086fc523-d4af-43d0-8c12-b3cf4693071b-kube-api-access-vv4vs\") pod \"must-gather-w2ntz\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.792013 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/086fc523-d4af-43d0-8c12-b3cf4693071b-must-gather-output\") pod \"must-gather-w2ntz\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.893756 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv4vs\" (UniqueName: \"kubernetes.io/projected/086fc523-d4af-43d0-8c12-b3cf4693071b-kube-api-access-vv4vs\") pod \"must-gather-w2ntz\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.893831 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/086fc523-d4af-43d0-8c12-b3cf4693071b-must-gather-output\") pod \"must-gather-w2ntz\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.894307 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/086fc523-d4af-43d0-8c12-b3cf4693071b-must-gather-output\") pod \"must-gather-w2ntz\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:54 crc kubenswrapper[5200]: I1126 16:16:54.913759 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv4vs\" (UniqueName: \"kubernetes.io/projected/086fc523-d4af-43d0-8c12-b3cf4693071b-kube-api-access-vv4vs\") pod \"must-gather-w2ntz\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:55 crc kubenswrapper[5200]: I1126 16:16:55.009121 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:16:55 crc kubenswrapper[5200]: I1126 16:16:55.519628 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pb99x/must-gather-w2ntz"] Nov 26 16:16:55 crc kubenswrapper[5200]: I1126 16:16:55.682997 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/must-gather-w2ntz" event={"ID":"086fc523-d4af-43d0-8c12-b3cf4693071b","Type":"ContainerStarted","Data":"bd209943d03cb32038ee2d3ca55652e99324bdf2fd32b3ab7d92b5c1384dc27d"} Nov 26 16:17:02 crc kubenswrapper[5200]: I1126 16:17:02.787458 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/must-gather-w2ntz" event={"ID":"086fc523-d4af-43d0-8c12-b3cf4693071b","Type":"ContainerStarted","Data":"a17d7a6b9b69255e80476dfd1ac5f4257eb563469de03e0811426dc0ce514a81"} Nov 26 16:17:02 crc kubenswrapper[5200]: I1126 16:17:02.788359 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/must-gather-w2ntz" event={"ID":"086fc523-d4af-43d0-8c12-b3cf4693071b","Type":"ContainerStarted","Data":"83d8aeb4add5b13b6057b7b44eea17e51da6b9ce1cb30cac19a9fdd17ce7d540"} Nov 26 16:17:02 crc kubenswrapper[5200]: I1126 16:17:02.820243 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pb99x/must-gather-w2ntz" podStartSLOduration=2.901355768 podStartE2EDuration="8.820216248s" podCreationTimestamp="2025-11-26 16:16:54 +0000 UTC" firstStartedPulling="2025-11-26 16:16:55.530753811 +0000 UTC m=+9984.209955619" lastFinishedPulling="2025-11-26 16:17:01.449614251 +0000 UTC m=+9990.128816099" observedRunningTime="2025-11-26 16:17:02.814741922 +0000 UTC m=+9991.493943800" watchObservedRunningTime="2025-11-26 16:17:02.820216248 +0000 UTC m=+9991.499418106" Nov 26 16:17:04 crc kubenswrapper[5200]: E1126 16:17:04.286386 5200 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.224:42718->38.102.83.224:42551: write tcp 38.102.83.224:42718->38.102.83.224:42551: write: connection reset by peer Nov 26 16:17:04 crc kubenswrapper[5200]: E1126 16:17:04.843870 5200 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.224:42858->38.102.83.224:42551: write tcp 38.102.83.224:42858->38.102.83.224:42551: write: broken pipe Nov 26 16:17:04 crc kubenswrapper[5200]: I1126 16:17:04.975994 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:17:04 crc kubenswrapper[5200]: E1126 16:17:04.988114 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.609085 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pb99x/crc-debug-n9bgb"] Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.618204 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.622312 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pb99x\"/\"default-dockercfg-vbsk7\"" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.785739 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvpzq\" (UniqueName: \"kubernetes.io/projected/ab526938-5c97-4ecf-a890-9101d8f8b17c-kube-api-access-mvpzq\") pod \"crc-debug-n9bgb\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.785815 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab526938-5c97-4ecf-a890-9101d8f8b17c-host\") pod \"crc-debug-n9bgb\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.888106 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvpzq\" (UniqueName: \"kubernetes.io/projected/ab526938-5c97-4ecf-a890-9101d8f8b17c-kube-api-access-mvpzq\") pod \"crc-debug-n9bgb\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.888167 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab526938-5c97-4ecf-a890-9101d8f8b17c-host\") pod \"crc-debug-n9bgb\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.888411 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab526938-5c97-4ecf-a890-9101d8f8b17c-host\") pod \"crc-debug-n9bgb\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.917427 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvpzq\" (UniqueName: \"kubernetes.io/projected/ab526938-5c97-4ecf-a890-9101d8f8b17c-kube-api-access-mvpzq\") pod \"crc-debug-n9bgb\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: I1126 16:17:05.940766 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:05 crc kubenswrapper[5200]: W1126 16:17:05.980256 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab526938_5c97_4ecf_a890_9101d8f8b17c.slice/crio-28bd47864ec2b35fbb544522a817def0949a6d8376925364b6e1fa3beb7103fa WatchSource:0}: Error finding container 28bd47864ec2b35fbb544522a817def0949a6d8376925364b6e1fa3beb7103fa: Status 404 returned error can't find the container with id 28bd47864ec2b35fbb544522a817def0949a6d8376925364b6e1fa3beb7103fa Nov 26 16:17:06 crc kubenswrapper[5200]: I1126 16:17:06.867955 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" event={"ID":"ab526938-5c97-4ecf-a890-9101d8f8b17c","Type":"ContainerStarted","Data":"28bd47864ec2b35fbb544522a817def0949a6d8376925364b6e1fa3beb7103fa"} Nov 26 16:17:17 crc kubenswrapper[5200]: I1126 16:17:17.970341 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:17:17 crc kubenswrapper[5200]: E1126 16:17:17.971668 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.135843 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kr7zf"] Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.189744 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kr7zf"] Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.189942 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.254328 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d65ae68-9754-4ce6-b600-6524b4424230-utilities\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.254843 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d65ae68-9754-4ce6-b600-6524b4424230-catalog-content\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.255088 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfmt\" (UniqueName: \"kubernetes.io/projected/3d65ae68-9754-4ce6-b600-6524b4424230-kube-api-access-mmfmt\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.356815 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfmt\" (UniqueName: \"kubernetes.io/projected/3d65ae68-9754-4ce6-b600-6524b4424230-kube-api-access-mmfmt\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.356975 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d65ae68-9754-4ce6-b600-6524b4424230-utilities\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.357122 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d65ae68-9754-4ce6-b600-6524b4424230-catalog-content\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.357643 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d65ae68-9754-4ce6-b600-6524b4424230-utilities\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.357717 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d65ae68-9754-4ce6-b600-6524b4424230-catalog-content\") pod \"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.415663 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfmt\" (UniqueName: \"kubernetes.io/projected/3d65ae68-9754-4ce6-b600-6524b4424230-kube-api-access-mmfmt\") pod 
\"certified-operators-kr7zf\" (UID: \"3d65ae68-9754-4ce6-b600-6524b4424230\") " pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:18 crc kubenswrapper[5200]: I1126 16:17:18.521865 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:19 crc kubenswrapper[5200]: I1126 16:17:19.003432 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" event={"ID":"ab526938-5c97-4ecf-a890-9101d8f8b17c","Type":"ContainerStarted","Data":"51b884e419e756562236ae78942a7550f41d274e1224658c98bce092eb8f3913"} Nov 26 16:17:19 crc kubenswrapper[5200]: I1126 16:17:19.022593 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" podStartSLOduration=1.390121956 podStartE2EDuration="14.02257357s" podCreationTimestamp="2025-11-26 16:17:05 +0000 UTC" firstStartedPulling="2025-11-26 16:17:05.984802188 +0000 UTC m=+9994.664004006" lastFinishedPulling="2025-11-26 16:17:18.617253802 +0000 UTC m=+10007.296455620" observedRunningTime="2025-11-26 16:17:19.016440987 +0000 UTC m=+10007.695642805" watchObservedRunningTime="2025-11-26 16:17:19.02257357 +0000 UTC m=+10007.701775378" Nov 26 16:17:19 crc kubenswrapper[5200]: I1126 16:17:19.042843 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kr7zf"] Nov 26 16:17:19 crc kubenswrapper[5200]: W1126 16:17:19.064606 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d65ae68_9754_4ce6_b600_6524b4424230.slice/crio-a5f40951520f0026b2b5568c8b0381108d2c6291233ee61e3d164461902377c6 WatchSource:0}: Error finding container a5f40951520f0026b2b5568c8b0381108d2c6291233ee61e3d164461902377c6: Status 404 returned error can't find the container with id a5f40951520f0026b2b5568c8b0381108d2c6291233ee61e3d164461902377c6 Nov 26 16:17:20 crc kubenswrapper[5200]: I1126 16:17:20.022427 5200 generic.go:358] "Generic (PLEG): container finished" podID="3d65ae68-9754-4ce6-b600-6524b4424230" containerID="11543e38605bbc353051a563f2a71d85e87b468709dec0863bcd7333c4fb670b" exitCode=0 Nov 26 16:17:20 crc kubenswrapper[5200]: I1126 16:17:20.025305 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr7zf" event={"ID":"3d65ae68-9754-4ce6-b600-6524b4424230","Type":"ContainerDied","Data":"11543e38605bbc353051a563f2a71d85e87b468709dec0863bcd7333c4fb670b"} Nov 26 16:17:20 crc kubenswrapper[5200]: I1126 16:17:20.025340 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr7zf" event={"ID":"3d65ae68-9754-4ce6-b600-6524b4424230","Type":"ContainerStarted","Data":"a5f40951520f0026b2b5568c8b0381108d2c6291233ee61e3d164461902377c6"} Nov 26 16:17:25 crc kubenswrapper[5200]: I1126 16:17:25.084798 5200 generic.go:358] "Generic (PLEG): container finished" podID="3d65ae68-9754-4ce6-b600-6524b4424230" containerID="e82026b6ec1b13816f2d6b898d0e128df62baaa8e3f2dd2a08a2e546641927b9" exitCode=0 Nov 26 16:17:25 crc kubenswrapper[5200]: I1126 16:17:25.084873 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr7zf" event={"ID":"3d65ae68-9754-4ce6-b600-6524b4424230","Type":"ContainerDied","Data":"e82026b6ec1b13816f2d6b898d0e128df62baaa8e3f2dd2a08a2e546641927b9"} Nov 26 16:17:26 crc kubenswrapper[5200]: I1126 16:17:26.114047 5200 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr7zf" event={"ID":"3d65ae68-9754-4ce6-b600-6524b4424230","Type":"ContainerStarted","Data":"99f844748bb93a13cb248545934c73c39cc317989dcd5c6abdf2829a5d4abeab"} Nov 26 16:17:26 crc kubenswrapper[5200]: I1126 16:17:26.137465 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kr7zf" podStartSLOduration=4.269312895 podStartE2EDuration="8.137446004s" podCreationTimestamp="2025-11-26 16:17:18 +0000 UTC" firstStartedPulling="2025-11-26 16:17:20.02514882 +0000 UTC m=+10008.704350648" lastFinishedPulling="2025-11-26 16:17:23.893281939 +0000 UTC m=+10012.572483757" observedRunningTime="2025-11-26 16:17:26.133012166 +0000 UTC m=+10014.812213994" watchObservedRunningTime="2025-11-26 16:17:26.137446004 +0000 UTC m=+10014.816647822" Nov 26 16:17:28 crc kubenswrapper[5200]: I1126 16:17:28.522893 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:28 crc kubenswrapper[5200]: I1126 16:17:28.523477 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:28 crc kubenswrapper[5200]: I1126 16:17:28.570366 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:30 crc kubenswrapper[5200]: I1126 16:17:30.970258 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:17:30 crc kubenswrapper[5200]: E1126 16:17:30.970854 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:17:39 crc kubenswrapper[5200]: I1126 16:17:39.206314 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kr7zf" Nov 26 16:17:39 crc kubenswrapper[5200]: I1126 16:17:39.282418 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kr7zf"] Nov 26 16:17:39 crc kubenswrapper[5200]: I1126 16:17:39.325833 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wdpbr"] Nov 26 16:17:39 crc kubenswrapper[5200]: I1126 16:17:39.326107 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wdpbr" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="registry-server" containerID="cri-o://2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c" gracePeriod=2 Nov 26 16:17:39 crc kubenswrapper[5200]: I1126 16:17:39.955168 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.082963 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-utilities\") pod \"395d7072-6b17-40a6-9969-3a5ddf3de469\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.083254 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-catalog-content\") pod \"395d7072-6b17-40a6-9969-3a5ddf3de469\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.083405 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwb2c\" (UniqueName: \"kubernetes.io/projected/395d7072-6b17-40a6-9969-3a5ddf3de469-kube-api-access-jwb2c\") pod \"395d7072-6b17-40a6-9969-3a5ddf3de469\" (UID: \"395d7072-6b17-40a6-9969-3a5ddf3de469\") " Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.085047 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-utilities" (OuterVolumeSpecName: "utilities") pod "395d7072-6b17-40a6-9969-3a5ddf3de469" (UID: "395d7072-6b17-40a6-9969-3a5ddf3de469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.086182 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.090655 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395d7072-6b17-40a6-9969-3a5ddf3de469-kube-api-access-jwb2c" (OuterVolumeSpecName: "kube-api-access-jwb2c") pod "395d7072-6b17-40a6-9969-3a5ddf3de469" (UID: "395d7072-6b17-40a6-9969-3a5ddf3de469"). InnerVolumeSpecName "kube-api-access-jwb2c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.126043 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "395d7072-6b17-40a6-9969-3a5ddf3de469" (UID: "395d7072-6b17-40a6-9969-3a5ddf3de469"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.187985 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwb2c\" (UniqueName: \"kubernetes.io/projected/395d7072-6b17-40a6-9969-3a5ddf3de469-kube-api-access-jwb2c\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.188363 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395d7072-6b17-40a6-9969-3a5ddf3de469-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.264495 5200 generic.go:358] "Generic (PLEG): container finished" podID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerID="2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c" exitCode=0 Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.264604 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wdpbr" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.264610 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerDied","Data":"2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c"} Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.264664 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wdpbr" event={"ID":"395d7072-6b17-40a6-9969-3a5ddf3de469","Type":"ContainerDied","Data":"1a1e2ab9a26a4f9b08c5b12194eae8b335351a3b8d7a98238bfa42a8aaef91ae"} Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.264695 5200 scope.go:117] "RemoveContainer" containerID="2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.291461 5200 scope.go:117] "RemoveContainer" containerID="2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.309861 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wdpbr"] Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.318259 5200 scope.go:117] "RemoveContainer" containerID="8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.324602 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wdpbr"] Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.360979 5200 scope.go:117] "RemoveContainer" containerID="2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c" Nov 26 16:17:40 crc kubenswrapper[5200]: E1126 16:17:40.361349 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c\": container with ID starting with 2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c not found: ID does not exist" containerID="2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.361379 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c"} err="failed to get container status 
\"2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c\": rpc error: code = NotFound desc = could not find container \"2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c\": container with ID starting with 2ce6be860efda6fdd4f7c148ef6b7296e73de80f09e6c08a769824b4d96a7a6c not found: ID does not exist" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.361397 5200 scope.go:117] "RemoveContainer" containerID="2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd" Nov 26 16:17:40 crc kubenswrapper[5200]: E1126 16:17:40.361723 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd\": container with ID starting with 2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd not found: ID does not exist" containerID="2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.361739 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd"} err="failed to get container status \"2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd\": rpc error: code = NotFound desc = could not find container \"2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd\": container with ID starting with 2d4ce0a24ab94a9e67b3d02e8439080144a328590e6aac995b54bbdd4ec86acd not found: ID does not exist" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.361750 5200 scope.go:117] "RemoveContainer" containerID="8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4" Nov 26 16:17:40 crc kubenswrapper[5200]: E1126 16:17:40.362015 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4\": container with ID starting with 8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4 not found: ID does not exist" containerID="8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.362032 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4"} err="failed to get container status \"8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4\": rpc error: code = NotFound desc = could not find container \"8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4\": container with ID starting with 8307cd294da54b7eb63d1ef0ba12c2989cd9a448b06cdbe49ce2bbcfc535b7f4 not found: ID does not exist" Nov 26 16:17:40 crc kubenswrapper[5200]: I1126 16:17:40.995350 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" path="/var/lib/kubelet/pods/395d7072-6b17-40a6-9969-3a5ddf3de469/volumes" Nov 26 16:17:42 crc kubenswrapper[5200]: E1126 16:17:42.724442 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2345396 actualBytes=10240 Nov 26 16:17:42 crc kubenswrapper[5200]: I1126 16:17:42.984742 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:17:42 crc kubenswrapper[5200]: E1126 16:17:42.985093 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:17:46 crc kubenswrapper[5200]: I1126 16:17:46.328728 5200 generic.go:358] "Generic (PLEG): container finished" podID="ab526938-5c97-4ecf-a890-9101d8f8b17c" containerID="51b884e419e756562236ae78942a7550f41d274e1224658c98bce092eb8f3913" exitCode=0 Nov 26 16:17:46 crc kubenswrapper[5200]: I1126 16:17:46.328805 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" event={"ID":"ab526938-5c97-4ecf-a890-9101d8f8b17c","Type":"ContainerDied","Data":"51b884e419e756562236ae78942a7550f41d274e1224658c98bce092eb8f3913"} Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.495354 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.547825 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pb99x/crc-debug-n9bgb"] Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.557321 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab526938-5c97-4ecf-a890-9101d8f8b17c-host\") pod \"ab526938-5c97-4ecf-a890-9101d8f8b17c\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.557378 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvpzq\" (UniqueName: \"kubernetes.io/projected/ab526938-5c97-4ecf-a890-9101d8f8b17c-kube-api-access-mvpzq\") pod \"ab526938-5c97-4ecf-a890-9101d8f8b17c\" (UID: \"ab526938-5c97-4ecf-a890-9101d8f8b17c\") " Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.557433 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab526938-5c97-4ecf-a890-9101d8f8b17c-host" (OuterVolumeSpecName: "host") pod "ab526938-5c97-4ecf-a890-9101d8f8b17c" (UID: "ab526938-5c97-4ecf-a890-9101d8f8b17c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.557925 5200 reconciler_common.go:299] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab526938-5c97-4ecf-a890-9101d8f8b17c-host\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.564146 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab526938-5c97-4ecf-a890-9101d8f8b17c-kube-api-access-mvpzq" (OuterVolumeSpecName: "kube-api-access-mvpzq") pod "ab526938-5c97-4ecf-a890-9101d8f8b17c" (UID: "ab526938-5c97-4ecf-a890-9101d8f8b17c"). InnerVolumeSpecName "kube-api-access-mvpzq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.565931 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pb99x/crc-debug-n9bgb"] Nov 26 16:17:47 crc kubenswrapper[5200]: I1126 16:17:47.659907 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvpzq\" (UniqueName: \"kubernetes.io/projected/ab526938-5c97-4ecf-a890-9101d8f8b17c-kube-api-access-mvpzq\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.354499 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.354505 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28bd47864ec2b35fbb544522a817def0949a6d8376925364b6e1fa3beb7103fa" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.377670 5200 status_manager.go:895] "Failed to get status for pod" podUID="ab526938-5c97-4ecf-a890-9101d8f8b17c" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" err="pods \"crc-debug-n9bgb\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pb99x\": no relationship found between node 'crc' and this object" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.694656 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pb99x/crc-debug-9p5nl"] Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696084 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="extract-content" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696104 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="extract-content" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696133 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="registry-server" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696139 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="registry-server" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696168 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="extract-utilities" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696174 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="extract-utilities" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696207 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab526938-5c97-4ecf-a890-9101d8f8b17c" containerName="container-00" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696214 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab526938-5c97-4ecf-a890-9101d8f8b17c" containerName="container-00" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696387 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="395d7072-6b17-40a6-9969-3a5ddf3de469" containerName="registry-server" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.696408 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab526938-5c97-4ecf-a890-9101d8f8b17c" 
containerName="container-00" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.701138 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.703787 5200 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pb99x\"/\"default-dockercfg-vbsk7\"" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.718012 5200 status_manager.go:895] "Failed to get status for pod" podUID="ab526938-5c97-4ecf-a890-9101d8f8b17c" pod="openshift-must-gather-pb99x/crc-debug-n9bgb" err="pods \"crc-debug-n9bgb\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pb99x\": no relationship found between node 'crc' and this object" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.786813 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1b627a0-d86e-433c-b13e-de1b770be0f0-host\") pod \"crc-debug-9p5nl\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.786927 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgvv8\" (UniqueName: \"kubernetes.io/projected/c1b627a0-d86e-433c-b13e-de1b770be0f0-kube-api-access-wgvv8\") pod \"crc-debug-9p5nl\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.889356 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1b627a0-d86e-433c-b13e-de1b770be0f0-host\") pod \"crc-debug-9p5nl\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.889425 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgvv8\" (UniqueName: \"kubernetes.io/projected/c1b627a0-d86e-433c-b13e-de1b770be0f0-kube-api-access-wgvv8\") pod \"crc-debug-9p5nl\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.889964 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1b627a0-d86e-433c-b13e-de1b770be0f0-host\") pod \"crc-debug-9p5nl\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.916443 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgvv8\" (UniqueName: \"kubernetes.io/projected/c1b627a0-d86e-433c-b13e-de1b770be0f0-kube-api-access-wgvv8\") pod \"crc-debug-9p5nl\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:48 crc kubenswrapper[5200]: I1126 16:17:48.983232 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab526938-5c97-4ecf-a890-9101d8f8b17c" path="/var/lib/kubelet/pods/ab526938-5c97-4ecf-a890-9101d8f8b17c/volumes" Nov 26 16:17:49 crc kubenswrapper[5200]: I1126 16:17:49.016840 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:49 crc kubenswrapper[5200]: I1126 16:17:49.365898 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" event={"ID":"c1b627a0-d86e-433c-b13e-de1b770be0f0","Type":"ContainerStarted","Data":"d2485ed694d4ff156bd3ffe69303f455822a50ed6d97cd8937cd4e87212079e8"} Nov 26 16:17:49 crc kubenswrapper[5200]: I1126 16:17:49.366250 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" event={"ID":"c1b627a0-d86e-433c-b13e-de1b770be0f0","Type":"ContainerStarted","Data":"96ac60cccdfd00b285735178ad89a6f3ccd0f3a639c6440698afee9764d4bdbd"} Nov 26 16:17:49 crc kubenswrapper[5200]: I1126 16:17:49.384504 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" podStartSLOduration=1.384486643 podStartE2EDuration="1.384486643s" podCreationTimestamp="2025-11-26 16:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 16:17:49.379135131 +0000 UTC m=+10038.058336969" watchObservedRunningTime="2025-11-26 16:17:49.384486643 +0000 UTC m=+10038.063688461" Nov 26 16:17:50 crc kubenswrapper[5200]: I1126 16:17:50.380490 5200 generic.go:358] "Generic (PLEG): container finished" podID="c1b627a0-d86e-433c-b13e-de1b770be0f0" containerID="d2485ed694d4ff156bd3ffe69303f455822a50ed6d97cd8937cd4e87212079e8" exitCode=1 Nov 26 16:17:50 crc kubenswrapper[5200]: I1126 16:17:50.380594 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" event={"ID":"c1b627a0-d86e-433c-b13e-de1b770be0f0","Type":"ContainerDied","Data":"d2485ed694d4ff156bd3ffe69303f455822a50ed6d97cd8937cd4e87212079e8"} Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.520309 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.560435 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pb99x/crc-debug-9p5nl"] Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.573024 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pb99x/crc-debug-9p5nl"] Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.652279 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgvv8\" (UniqueName: \"kubernetes.io/projected/c1b627a0-d86e-433c-b13e-de1b770be0f0-kube-api-access-wgvv8\") pod \"c1b627a0-d86e-433c-b13e-de1b770be0f0\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.652415 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1b627a0-d86e-433c-b13e-de1b770be0f0-host\") pod \"c1b627a0-d86e-433c-b13e-de1b770be0f0\" (UID: \"c1b627a0-d86e-433c-b13e-de1b770be0f0\") " Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.652489 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1b627a0-d86e-433c-b13e-de1b770be0f0-host" (OuterVolumeSpecName: "host") pod "c1b627a0-d86e-433c-b13e-de1b770be0f0" (UID: "c1b627a0-d86e-433c-b13e-de1b770be0f0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.653056 5200 reconciler_common.go:299] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1b627a0-d86e-433c-b13e-de1b770be0f0-host\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.658928 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b627a0-d86e-433c-b13e-de1b770be0f0-kube-api-access-wgvv8" (OuterVolumeSpecName: "kube-api-access-wgvv8") pod "c1b627a0-d86e-433c-b13e-de1b770be0f0" (UID: "c1b627a0-d86e-433c-b13e-de1b770be0f0"). InnerVolumeSpecName "kube-api-access-wgvv8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:17:51 crc kubenswrapper[5200]: I1126 16:17:51.755313 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wgvv8\" (UniqueName: \"kubernetes.io/projected/c1b627a0-d86e-433c-b13e-de1b770be0f0-kube-api-access-wgvv8\") on node \"crc\" DevicePath \"\"" Nov 26 16:17:52 crc kubenswrapper[5200]: I1126 16:17:52.404093 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" Nov 26 16:17:52 crc kubenswrapper[5200]: I1126 16:17:52.404103 5200 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ac60cccdfd00b285735178ad89a6f3ccd0f3a639c6440698afee9764d4bdbd" Nov 26 16:17:52 crc kubenswrapper[5200]: I1126 16:17:52.426646 5200 status_manager.go:895] "Failed to get status for pod" podUID="c1b627a0-d86e-433c-b13e-de1b770be0f0" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" err="pods \"crc-debug-9p5nl\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pb99x\": no relationship found between node 'crc' and this object" Nov 26 16:17:53 crc kubenswrapper[5200]: I1126 16:17:53.015501 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b627a0-d86e-433c-b13e-de1b770be0f0" path="/var/lib/kubelet/pods/c1b627a0-d86e-433c-b13e-de1b770be0f0/volumes" Nov 26 16:17:53 crc kubenswrapper[5200]: I1126 16:17:53.016677 5200 status_manager.go:895] "Failed to get status for pod" podUID="c1b627a0-d86e-433c-b13e-de1b770be0f0" pod="openshift-must-gather-pb99x/crc-debug-9p5nl" err="pods \"crc-debug-9p5nl\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pb99x\": no relationship found between node 'crc' and this object" Nov 26 16:17:54 crc kubenswrapper[5200]: I1126 16:17:54.970080 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:17:54 crc kubenswrapper[5200]: E1126 16:17:54.970872 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:18:09 crc kubenswrapper[5200]: I1126 16:18:09.969717 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:18:09 crc kubenswrapper[5200]: E1126 16:18:09.970559 5200 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:18:24 crc kubenswrapper[5200]: I1126 16:18:24.970568 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:18:24 crc kubenswrapper[5200]: E1126 16:18:24.971832 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:18:29 crc kubenswrapper[5200]: I1126 16:18:29.931666 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxqmh"] Nov 26 16:18:29 crc kubenswrapper[5200]: I1126 16:18:29.934304 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1b627a0-d86e-433c-b13e-de1b770be0f0" containerName="container-00" Nov 26 16:18:29 crc kubenswrapper[5200]: I1126 16:18:29.934340 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b627a0-d86e-433c-b13e-de1b770be0f0" containerName="container-00" Nov 26 16:18:29 crc kubenswrapper[5200]: I1126 16:18:29.934712 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1b627a0-d86e-433c-b13e-de1b770be0f0" containerName="container-00" Nov 26 16:18:29 crc kubenswrapper[5200]: I1126 16:18:29.943208 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:29 crc kubenswrapper[5200]: I1126 16:18:29.949227 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxqmh"] Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.078116 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-utilities\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.078447 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-catalog-content\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.078659 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97xv\" (UniqueName: \"kubernetes.io/projected/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-kube-api-access-t97xv\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.180789 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-catalog-content\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.180918 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t97xv\" (UniqueName: \"kubernetes.io/projected/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-kube-api-access-t97xv\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.181071 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-utilities\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.181549 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-utilities\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.181542 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-catalog-content\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.202541 5200 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t97xv\" (UniqueName: \"kubernetes.io/projected/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-kube-api-access-t97xv\") pod \"redhat-marketplace-cxqmh\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.277424 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.732313 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxqmh"] Nov 26 16:18:30 crc kubenswrapper[5200]: W1126 16:18:30.732396 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0798d031_1c2d_40ce_b9e8_7a7c5eda5525.slice/crio-44c3c8fabb11e776d6a73ab37618d50f0a7c531ef93703eab62249c486700b29 WatchSource:0}: Error finding container 44c3c8fabb11e776d6a73ab37618d50f0a7c531ef93703eab62249c486700b29: Status 404 returned error can't find the container with id 44c3c8fabb11e776d6a73ab37618d50f0a7c531ef93703eab62249c486700b29 Nov 26 16:18:30 crc kubenswrapper[5200]: I1126 16:18:30.854219 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxqmh" event={"ID":"0798d031-1c2d-40ce-b9e8-7a7c5eda5525","Type":"ContainerStarted","Data":"44c3c8fabb11e776d6a73ab37618d50f0a7c531ef93703eab62249c486700b29"} Nov 26 16:18:31 crc kubenswrapper[5200]: I1126 16:18:31.866813 5200 generic.go:358] "Generic (PLEG): container finished" podID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerID="6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401" exitCode=0 Nov 26 16:18:31 crc kubenswrapper[5200]: I1126 16:18:31.866913 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxqmh" event={"ID":"0798d031-1c2d-40ce-b9e8-7a7c5eda5525","Type":"ContainerDied","Data":"6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401"} Nov 26 16:18:33 crc kubenswrapper[5200]: I1126 16:18:33.891304 5200 generic.go:358] "Generic (PLEG): container finished" podID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerID="8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db" exitCode=0 Nov 26 16:18:33 crc kubenswrapper[5200]: I1126 16:18:33.892848 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxqmh" event={"ID":"0798d031-1c2d-40ce-b9e8-7a7c5eda5525","Type":"ContainerDied","Data":"8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db"} Nov 26 16:18:34 crc kubenswrapper[5200]: I1126 16:18:34.906966 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxqmh" event={"ID":"0798d031-1c2d-40ce-b9e8-7a7c5eda5525","Type":"ContainerStarted","Data":"d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57"} Nov 26 16:18:34 crc kubenswrapper[5200]: I1126 16:18:34.926102 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxqmh" podStartSLOduration=4.706934304 podStartE2EDuration="5.926082423s" podCreationTimestamp="2025-11-26 16:18:29 +0000 UTC" firstStartedPulling="2025-11-26 16:18:31.869265408 +0000 UTC m=+10080.548467226" lastFinishedPulling="2025-11-26 16:18:33.088413517 +0000 UTC m=+10081.767615345" observedRunningTime="2025-11-26 16:18:34.925488557 +0000 UTC m=+10083.604690385" 
watchObservedRunningTime="2025-11-26 16:18:34.926082423 +0000 UTC m=+10083.605284241" Nov 26 16:18:38 crc kubenswrapper[5200]: I1126 16:18:38.969774 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:18:38 crc kubenswrapper[5200]: E1126 16:18:38.971029 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:18:40 crc kubenswrapper[5200]: I1126 16:18:40.278279 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:40 crc kubenswrapper[5200]: I1126 16:18:40.278606 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:40 crc kubenswrapper[5200]: I1126 16:18:40.336331 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:41 crc kubenswrapper[5200]: I1126 16:18:41.067074 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:41 crc kubenswrapper[5200]: I1126 16:18:41.517752 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxqmh"] Nov 26 16:18:42 crc kubenswrapper[5200]: E1126 16:18:42.628522 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2435428 actualBytes=10240 Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.015391 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cxqmh" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="registry-server" containerID="cri-o://d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57" gracePeriod=2 Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.507853 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.618717 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-catalog-content\") pod \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.619213 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t97xv\" (UniqueName: \"kubernetes.io/projected/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-kube-api-access-t97xv\") pod \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.619469 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-utilities\") pod \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\" (UID: \"0798d031-1c2d-40ce-b9e8-7a7c5eda5525\") " Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.620718 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-utilities" (OuterVolumeSpecName: "utilities") pod "0798d031-1c2d-40ce-b9e8-7a7c5eda5525" (UID: "0798d031-1c2d-40ce-b9e8-7a7c5eda5525"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.627052 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-kube-api-access-t97xv" (OuterVolumeSpecName: "kube-api-access-t97xv") pod "0798d031-1c2d-40ce-b9e8-7a7c5eda5525" (UID: "0798d031-1c2d-40ce-b9e8-7a7c5eda5525"). InnerVolumeSpecName "kube-api-access-t97xv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.628299 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0798d031-1c2d-40ce-b9e8-7a7c5eda5525" (UID: "0798d031-1c2d-40ce-b9e8-7a7c5eda5525"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.722132 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.722169 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:18:43 crc kubenswrapper[5200]: I1126 16:18:43.722183 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t97xv\" (UniqueName: \"kubernetes.io/projected/0798d031-1c2d-40ce-b9e8-7a7c5eda5525-kube-api-access-t97xv\") on node \"crc\" DevicePath \"\"" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.029275 5200 generic.go:358] "Generic (PLEG): container finished" podID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerID="d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57" exitCode=0 Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.029361 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxqmh" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.029379 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxqmh" event={"ID":"0798d031-1c2d-40ce-b9e8-7a7c5eda5525","Type":"ContainerDied","Data":"d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57"} Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.029418 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxqmh" event={"ID":"0798d031-1c2d-40ce-b9e8-7a7c5eda5525","Type":"ContainerDied","Data":"44c3c8fabb11e776d6a73ab37618d50f0a7c531ef93703eab62249c486700b29"} Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.029439 5200 scope.go:117] "RemoveContainer" containerID="d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.053952 5200 scope.go:117] "RemoveContainer" containerID="8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.081153 5200 scope.go:117] "RemoveContainer" containerID="6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.082192 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxqmh"] Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.095998 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxqmh"] Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.132159 5200 scope.go:117] "RemoveContainer" containerID="d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57" Nov 26 16:18:44 crc kubenswrapper[5200]: E1126 16:18:44.132648 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57\": container with ID starting with d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57 not found: ID does not exist" containerID="d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.132772 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57"} err="failed to get container status \"d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57\": rpc error: code = NotFound desc = could not find container \"d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57\": container with ID starting with d409d515da0814635379ca9a93882fafd8d43938926a1fa26acf2f552662aa57 not found: ID does not exist" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.132807 5200 scope.go:117] "RemoveContainer" containerID="8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db" Nov 26 16:18:44 crc kubenswrapper[5200]: E1126 16:18:44.133335 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db\": container with ID starting with 8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db not found: ID does not exist" containerID="8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.133405 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db"} err="failed to get container status \"8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db\": rpc error: code = NotFound desc = could not find container \"8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db\": container with ID starting with 8e113b22c7f7cc7abdc9729da3ac3f01baf7470a4f45dda1a3d388f056bd80db not found: ID does not exist" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.133431 5200 scope.go:117] "RemoveContainer" containerID="6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401" Nov 26 16:18:44 crc kubenswrapper[5200]: E1126 16:18:44.133821 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401\": container with ID starting with 6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401 not found: ID does not exist" containerID="6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401" Nov 26 16:18:44 crc kubenswrapper[5200]: I1126 16:18:44.133851 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401"} err="failed to get container status \"6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401\": rpc error: code = NotFound desc = could not find container \"6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401\": container with ID starting with 6261ecebe698897f682d5b9858d8e29b74706010bd573d9737d01e1277f1a401 not found: ID does not exist" Nov 26 16:18:45 crc kubenswrapper[5200]: I1126 16:18:45.001873 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" path="/var/lib/kubelet/pods/0798d031-1c2d-40ce-b9e8-7a7c5eda5525/volumes" Nov 26 16:18:50 crc kubenswrapper[5200]: I1126 16:18:50.970033 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:18:50 crc kubenswrapper[5200]: E1126 16:18:50.971012 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:19:04 crc kubenswrapper[5200]: I1126 16:19:04.987910 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:19:04 crc kubenswrapper[5200]: E1126 16:19:04.988807 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:19:18 crc kubenswrapper[5200]: I1126 16:19:18.970448 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:19:18 crc kubenswrapper[5200]: E1126 16:19:18.971463 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:19:30 crc kubenswrapper[5200]: I1126 16:19:30.970498 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:19:30 crc kubenswrapper[5200]: E1126 16:19:30.971917 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:19:41 crc kubenswrapper[5200]: I1126 16:19:41.969645 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:19:41 crc kubenswrapper[5200]: E1126 16:19:41.970534 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:19:42 crc kubenswrapper[5200]: E1126 16:19:42.526513 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2345393 actualBytes=10240 Nov 26 16:19:55 crc kubenswrapper[5200]: I1126 16:19:55.970110 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:19:55 crc kubenswrapper[5200]: E1126 16:19:55.971013 5200 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:20:09 crc kubenswrapper[5200]: I1126 16:20:09.969581 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:20:09 crc kubenswrapper[5200]: E1126 16:20:09.970512 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:20:23 crc kubenswrapper[5200]: I1126 16:20:23.969461 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:20:23 crc kubenswrapper[5200]: E1126 16:20:23.970439 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:20:36 crc kubenswrapper[5200]: I1126 16:20:36.984833 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:20:36 crc kubenswrapper[5200]: I1126 16:20:36.991143 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 16:20:37 crc kubenswrapper[5200]: I1126 16:20:37.415754 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"3c29e2008f28fd67b4e40ba4747736028cd6efcc421eb390996a19a1d2fecaab"} Nov 26 16:20:40 crc kubenswrapper[5200]: I1126 16:20:40.158158 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:20:40 crc kubenswrapper[5200]: I1126 16:20:40.177948 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:20:40 crc kubenswrapper[5200]: I1126 16:20:40.191125 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:20:40 crc kubenswrapper[5200]: I1126 16:20:40.201420 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:20:42 crc kubenswrapper[5200]: E1126 16:20:42.670905 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2344069 actualBytes=10240 Nov 26 16:20:52 crc 
kubenswrapper[5200]: I1126 16:20:52.089000 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_12dbd5cb-56ae-4804-bfbf-e23b17e370dc/init-config-reloader/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.257716 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_12dbd5cb-56ae-4804-bfbf-e23b17e370dc/init-config-reloader/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.314482 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_12dbd5cb-56ae-4804-bfbf-e23b17e370dc/alertmanager/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.368308 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_12dbd5cb-56ae-4804-bfbf-e23b17e370dc/config-reloader/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.502387 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_826dcf1f-7136-4b7f-b5c0-66788d96f2df/aodh-api/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.544824 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_826dcf1f-7136-4b7f-b5c0-66788d96f2df/aodh-listener/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.608583 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_826dcf1f-7136-4b7f-b5c0-66788d96f2df/aodh-evaluator/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.739342 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_826dcf1f-7136-4b7f-b5c0-66788d96f2df/aodh-notifier/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.802784 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7859465cfd-8zj58_fd4019f7-8fb5-4355-84d8-f4f76473ccb0/barbican-api/0.log" Nov 26 16:20:52 crc kubenswrapper[5200]: I1126 16:20:52.816198 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7859465cfd-8zj58_fd4019f7-8fb5-4355-84d8-f4f76473ccb0/barbican-api-log/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.004518 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cbf768654-gszst_33fba1e6-7543-4be3-a405-bd2a4fe192a7/barbican-keystone-listener/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.037956 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cbf768654-gszst_33fba1e6-7543-4be3-a405-bd2a4fe192a7/barbican-keystone-listener-log/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.248945 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d76b587dc-65xw5_ac1fd7ae-4ef2-4894-a7c0-ea644c68c908/barbican-worker/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.273754 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d76b587dc-65xw5_ac1fd7ae-4ef2-4894-a7c0-ea644c68c908/barbican-worker-log/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.306914 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-rg4cg_30e6b5f6-86f5-422f-92c2-f862bb5ef259/bootstrap-openstack-openstack-cell1/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.484200 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_b7f4347d-c34a-497d-a741-62d81b71a86d/ceilometer-central-agent/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.529140 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f4347d-c34a-497d-a741-62d81b71a86d/ceilometer-notification-agent/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.648127 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f4347d-c34a-497d-a741-62d81b71a86d/proxy-httpd/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.732141 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f4347d-c34a-497d-a741-62d81b71a86d/sg-core/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.819153 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-8x46w_23d9ad7b-8d47-49ce-90b8-698c302994f8/ceph-client-openstack-openstack-cell1/0.log" Nov 26 16:20:53 crc kubenswrapper[5200]: I1126 16:20:53.997722 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9c1a1082-4802-475f-ba21-870d30cb5247/cinder-api-log/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.039980 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9c1a1082-4802-475f-ba21-870d30cb5247/cinder-api/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.322707 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9ab2371f-4366-4f69-9cdd-562c6581d05a/probe/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.349352 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9ab2371f-4366-4f69-9cdd-562c6581d05a/cinder-backup/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.505978 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_72e329ae-c511-4991-9788-3c303b795651/cinder-scheduler/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.594173 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_72e329ae-c511-4991-9788-3c303b795651/probe/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.636520 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_e186001a-92bc-4767-916f-8755715db4b8/cinder-volume/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.861620 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-zd96f_e0a6fa0f-bfcf-4475-b38a-7c90139fdc7c/configure-network-openstack-openstack-cell1/0.log" Nov 26 16:20:54 crc kubenswrapper[5200]: I1126 16:20:54.875097 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_e186001a-92bc-4767-916f-8755715db4b8/probe/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.061236 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-rgb5g_603bb6c7-0b64-4c48-ac82-ce369807a22d/configure-os-openstack-openstack-cell1/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.073789 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74dcc4787f-bkzrl_82d441b1-167f-4309-92c2-36abaca7f602/init/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.255787 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-74dcc4787f-bkzrl_82d441b1-167f-4309-92c2-36abaca7f602/init/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.276311 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-shd7t_472d7d34-e34d-4b58-9c6a-f290dd6561f0/download-cache-openstack-openstack-cell1/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.289084 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74dcc4787f-bkzrl_82d441b1-167f-4309-92c2-36abaca7f602/dnsmasq-dns/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.487483 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a200f8e7-0290-47f8-8d77-769ebaa23422/glance-httpd/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.493334 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a200f8e7-0290-47f8-8d77-769ebaa23422/glance-log/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.573850 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_aa79542b-64a8-44dc-b8fa-131c110cf1ab/glance-httpd/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.589743 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_aa79542b-64a8-44dc-b8fa-131c110cf1ab/glance-log/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.766132 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7cc8b7d87-td966_90191aa6-cceb-48f3-9af3-ed83fcb00f1a/heat-api/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.933531 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5f6ff6448b-8b8fd_8fd4cbed-3800-4e75-8b5d-8a49a33debe0/heat-cfnapi/0.log" Nov 26 16:20:55 crc kubenswrapper[5200]: I1126 16:20:55.961646 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5896b68b5c-qkmxp_a7e04d58-74fa-4aae-97bc-5d139867905b/heat-engine/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.147123 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-vv4s4_9817c26f-dc8b-449f-ab69-3a627bd97e69/install-certs-openstack-openstack-cell1/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.159124 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78d4d4fccc-2h985_4562c0bf-dfab-46f9-848a-b6819d557283/horizon/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.208364 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78d4d4fccc-2h985_4562c0bf-dfab-46f9-848a-b6819d557283/horizon-log/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.365012 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-kbtn4_8e8a19ae-5836-403f-92cc-d102b1fbe3af/install-os-openstack-openstack-cell1/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.475749 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29402821-wtjlz_23c934e2-7b5a-4567-aac8-b8ffea246b3d/keystone-cron/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.523302 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f7d99cff4-gn8zk_146a6403-b94d-48dd-8682-827c7dcf966d/keystone-api/0.log" 
Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.623928 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29402881-4vvr6_1c323096-3357-46d1-b42c-d5d0f9df9550/keystone-cron/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.678444 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8911506d-17bb-44ea-8510-c5d547f65a28/kube-state-metrics/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.825646 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-m9k4v_b36710e1-d61f-4fcc-9c44-d1298dedac90/libvirt-openstack-openstack-cell1/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.942029 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_50646301-6fce-4190-9a6b-0978c91d58ac/manila-api-log/0.log" Nov 26 16:20:56 crc kubenswrapper[5200]: I1126 16:20:56.945531 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_50646301-6fce-4190-9a6b-0978c91d58ac/manila-api/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.064736 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_b9762bc2-03ed-4011-b44e-817143b25ca0/manila-scheduler/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.157598 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_b9762bc2-03ed-4011-b44e-817143b25ca0/probe/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.220989 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_c810050e-86da-4bea-8606-81b4a0c7d1a4/manila-share/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.306530 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_c810050e-86da-4bea-8606-81b4a0c7d1a4/probe/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.585708 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d9c6594ff-hl6n8_fc2a4c62-a08e-4442-a3c3-b26e0360b30a/neutron-api/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.641570 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d9c6594ff-hl6n8_fc2a4c62-a08e-4442-a3c3-b26e0360b30a/neutron-httpd/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.773704 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-5fjf9_1df5f5ba-1852-442d-9f4a-6d79ffa0ae94/neutron-dhcp-openstack-openstack-cell1/0.log" Nov 26 16:20:57 crc kubenswrapper[5200]: I1126 16:20:57.915853 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-r7nwj_56520482-0e09-42f1-be88-5e063facd0d1/neutron-metadata-openstack-openstack-cell1/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.098233 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-dk8vz_f2843638-a976-41b5-b129-9186db7b9257/neutron-sriov-openstack-openstack-cell1/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.244801 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_deba3181-5f0e-459b-a975-541e140dd515/nova-api-api/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.317596 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_deba3181-5f0e-459b-a975-541e140dd515/nova-api-log/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.412906 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a3e9d79c-5406-4d25-8ef3-376e4c88c0ea/nova-cell0-conductor-conductor/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.649418 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_35b67423-290e-4d22-a345-db88e6c279cd/nova-cell1-conductor-conductor/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.825938 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9203d723-6e51-4b7a-a443-3e9983dc3fa1/nova-cell1-novncproxy-novncproxy/0.log" Nov 26 16:20:58 crc kubenswrapper[5200]: I1126 16:20:58.999716 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld9g5n_4308e63d-232d-4d73-a202-5bb6000bac9c/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.178394 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-fxsch_047cc627-0c41-454c-a21e-ef04470f6900/nova-cell1-openstack-openstack-cell1/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.295978 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fde119d8-8d31-4f78-b9dc-f3415b3a6f67/nova-metadata-log/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.295992 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fde119d8-8d31-4f78-b9dc-f3415b3a6f67/nova-metadata-metadata/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.517448 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5cc8d89f58-4k72z_97a21845-f322-4fd9-9e6a-2d988ecf13a1/init/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.518647 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_aba34874-24d5-4073-8965-f81ac1846ea1/nova-scheduler-scheduler/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.710570 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5cc8d89f58-4k72z_97a21845-f322-4fd9-9e6a-2d988ecf13a1/init/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.747615 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5cc8d89f58-4k72z_97a21845-f322-4fd9-9e6a-2d988ecf13a1/octavia-api-provider-agent/0.log" Nov 26 16:20:59 crc kubenswrapper[5200]: I1126 16:20:59.956768 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-nnl9w_43065d0b-e6ea-4c72-b512-58c58235ede5/init/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.058762 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-nnl9w_43065d0b-e6ea-4c72-b512-58c58235ede5/init/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.071615 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5cc8d89f58-4k72z_97a21845-f322-4fd9-9e6a-2d988ecf13a1/octavia-api/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.186050 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-healthmanager-nnl9w_43065d0b-e6ea-4c72-b512-58c58235ede5/octavia-healthmanager/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.255621 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-rd8d9_0eff5089-5023-4c2f-930f-1bc252c879a9/init/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.436853 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-rd8d9_0eff5089-5023-4c2f-930f-1bc252c879a9/octavia-housekeeping/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.470851 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-rd8d9_0eff5089-5023-4c2f-930f-1bc252c879a9/init/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.548949 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-fcc7bff87-qqxfk_60d99396-e942-4d33-88ff-f0c5e8b135a0/init/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.715426 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-fcc7bff87-qqxfk_60d99396-e942-4d33-88ff-f0c5e8b135a0/octavia-amphora-httpd/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.720466 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-fcc7bff87-qqxfk_60d99396-e942-4d33-88ff-f0c5e8b135a0/init/0.log" Nov 26 16:21:00 crc kubenswrapper[5200]: I1126 16:21:00.778643 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-k5cn8_f39891c5-2a90-40c2-b035-e19b9b63363a/init/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.023891 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-k5cn8_f39891c5-2a90-40c2-b035-e19b9b63363a/init/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.040209 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-k5cn8_f39891c5-2a90-40c2-b035-e19b9b63363a/octavia-rsyslog/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.100026 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-rwxpq_01edcd7b-aec3-46e4-8638-545742dd0821/init/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.342285 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-rwxpq_01edcd7b-aec3-46e4-8638-545742dd0821/init/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.393110 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5299e9c2-94ad-423b-8b68-790b3554d0e0/mysql-bootstrap/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.476110 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-rwxpq_01edcd7b-aec3-46e4-8638-545742dd0821/octavia-worker/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.668772 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5299e9c2-94ad-423b-8b68-790b3554d0e0/mysql-bootstrap/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.675575 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5299e9c2-94ad-423b-8b68-790b3554d0e0/galera/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.750615 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_bb08e317-49b5-4bda-9130-0cb6c5780603/mysql-bootstrap/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.958113 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bb08e317-49b5-4bda-9130-0cb6c5780603/galera/0.log" Nov 26 16:21:01 crc kubenswrapper[5200]: I1126 16:21:01.966754 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bb08e317-49b5-4bda-9130-0cb6c5780603/mysql-bootstrap/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.101483 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_eff47485-b114-4f46-a598-85fd322a1f63/openstackclient/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.182769 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wvc4w_472b8ab3-bd19-4977-bf47-b2336521f28f/openstack-network-exporter/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.345311 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t88m7_9f1d74d7-e2c1-4e67-8d72-785570da51f1/ovsdb-server-init/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.520417 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t88m7_9f1d74d7-e2c1-4e67-8d72-785570da51f1/ovsdb-server-init/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.539940 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t88m7_9f1d74d7-e2c1-4e67-8d72-785570da51f1/ovs-vswitchd/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.575997 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t88m7_9f1d74d7-e2c1-4e67-8d72-785570da51f1/ovsdb-server/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.743179 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vqkcw_b2e3f416-2e99-40ef-b104-c2ef740b5a2b/ovn-controller/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.804279 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d/openstack-network-exporter/0.log" Nov 26 16:21:02 crc kubenswrapper[5200]: I1126 16:21:02.874063 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_659d0a2a-3ae4-4fcf-850e-37e4a6f2df4d/ovn-northd/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.118046 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-62d9h_9c463695-8a19-4649-bbc3-85c2c13225fe/ovn-openstack-openstack-cell1/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.300286 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_05004e62-63f8-47c6-a0ca-28b5f9c8fb08/openstack-network-exporter/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.423068 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_05004e62-63f8-47c6-a0ca-28b5f9c8fb08/ovsdbserver-nb/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.480784 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_36a4832a-90a2-4b28-a12a-0743b9f65bb6/openstack-network-exporter/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.534200 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_36a4832a-90a2-4b28-a12a-0743b9f65bb6/ovsdbserver-nb/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.688952 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_1d667330-6af2-4a54-95da-7322f42da901/openstack-network-exporter/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.769065 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_1d667330-6af2-4a54-95da-7322f42da901/ovsdbserver-nb/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.866680 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0e8f2732-c3fe-4e7e-b3a3-7563d86ca513/openstack-network-exporter/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.932630 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0e8f2732-c3fe-4e7e-b3a3-7563d86ca513/ovsdbserver-sb/0.log" Nov 26 16:21:03 crc kubenswrapper[5200]: I1126 16:21:03.955816 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_b8227c6b-ba6c-4c71-81bd-06d34798fcc5/openstack-network-exporter/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.101948 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_b8227c6b-ba6c-4c71-81bd-06d34798fcc5/ovsdbserver-sb/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.190119 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_600f88b1-0e72-4a09-a577-dc179d2193e6/openstack-network-exporter/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.299987 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_600f88b1-0e72-4a09-a577-dc179d2193e6/ovsdbserver-sb/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.528222 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-dd54f8444-drbpt_04b48eb6-187a-4044-abb8-e64b68df8f17/placement-api/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.609301 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-dd54f8444-drbpt_04b48eb6-187a-4044-abb8-e64b68df8f17/placement-log/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.679884 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cz746q_63d2b0dc-1fe3-4f96-b48c-397b167f509f/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.831307 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e0fed987-4381-4dae-b6d3-c0a5f5a5a605/init-config-reloader/0.log" Nov 26 16:21:04 crc kubenswrapper[5200]: I1126 16:21:04.996381 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e0fed987-4381-4dae-b6d3-c0a5f5a5a605/init-config-reloader/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.031257 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e0fed987-4381-4dae-b6d3-c0a5f5a5a605/thanos-sidecar/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.043698 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e0fed987-4381-4dae-b6d3-c0a5f5a5a605/config-reloader/0.log" Nov 26 16:21:05 crc 
kubenswrapper[5200]: I1126 16:21:05.069044 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e0fed987-4381-4dae-b6d3-c0a5f5a5a605/prometheus/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.277923 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1dec395e-bd7f-4021-a151-5a86dd782ee2/setup-container/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.399083 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1dec395e-bd7f-4021-a151-5a86dd782ee2/setup-container/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.480121 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1dec395e-bd7f-4021-a151-5a86dd782ee2/rabbitmq/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.490221 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5c6740ee-f526-4bb4-b6c9-ab3304b62709/setup-container/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.686464 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5c6740ee-f526-4bb4-b6c9-ab3304b62709/setup-container/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.789734 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5c6740ee-f526-4bb4-b6c9-ab3304b62709/rabbitmq/0.log" Nov 26 16:21:05 crc kubenswrapper[5200]: I1126 16:21:05.802741 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-2n9fd_4677d455-f7eb-4931-b4ea-b334c9610fac/reboot-os-openstack-openstack-cell1/0.log" Nov 26 16:21:06 crc kubenswrapper[5200]: I1126 16:21:06.036348 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-7wkgk_a9cec933-2b92-4db1-a8a7-5f89ea7e303a/run-os-openstack-openstack-cell1/0.log" Nov 26 16:21:06 crc kubenswrapper[5200]: I1126 16:21:06.061955 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-w5njm_52cd4880-5b1a-47af-9502-5ce41c93a94f/ssh-known-hosts-openstack/0.log" Nov 26 16:21:06 crc kubenswrapper[5200]: I1126 16:21:06.276444 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-4vqb4_70357171-3962-45c5-9472-042e377c456b/telemetry-openstack-openstack-cell1/0.log" Nov 26 16:21:06 crc kubenswrapper[5200]: I1126 16:21:06.335978 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-dxmjv_180e251a-424a-4738-852d-1c2bb96044df/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Nov 26 16:21:06 crc kubenswrapper[5200]: I1126 16:21:06.514229 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-7hwpz_bb94087e-4cfd-41c7-8da0-43fe92ef5d2c/validate-network-openstack-openstack-cell1/0.log" Nov 26 16:21:08 crc kubenswrapper[5200]: I1126 16:21:08.237168 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ca944d31-1c29-4808-aed1-09dcdd775199/memcached/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.111845 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/util/0.log" Nov 26 16:21:27 crc 
kubenswrapper[5200]: I1126 16:21:27.294257 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/util/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.322174 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/pull/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.329571 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/pull/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.574571 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/util/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.661639 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/pull/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.717977 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_640e55d0e4a2963e096b308a882571c4a1506306094fa82c470c7388ddmk6gc_f80ea0f0-fe9e-4342-bd0d-424dee43b603/extract/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.801427 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-57ffb97849-48vc9_a7d2842d-c7b4-4d44-9734-b8378d3346a2/kube-rbac-proxy/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.907240 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-57ffb97849-48vc9_a7d2842d-c7b4-4d44-9734-b8378d3346a2/manager/0.log" Nov 26 16:21:27 crc kubenswrapper[5200]: I1126 16:21:27.963769 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79976d8764-8xs2k_db4bbd86-c902-4e34-bb73-14f2049c52bc/kube-rbac-proxy/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.129622 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-664984dfdb-9llqp_784e0ea8-032f-4247-87c8-890fb70038f3/kube-rbac-proxy/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.153988 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79976d8764-8xs2k_db4bbd86-c902-4e34-bb73-14f2049c52bc/manager/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.183604 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-664984dfdb-9llqp_784e0ea8-032f-4247-87c8-890fb70038f3/manager/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.328167 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77c8d6d47b-mv8dt_67d8cd88-e803-4d94-a43e-4e52da2c2baf/kube-rbac-proxy/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.519817 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77c8d6d47b-mv8dt_67d8cd88-e803-4d94-a43e-4e52da2c2baf/manager/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.558626 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7d5b8c9b49-lsh9m_7810ba68-8ec1-4268-99ab-196e7d964387/kube-rbac-proxy/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.600194 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7d5b8c9b49-lsh9m_7810ba68-8ec1-4268-99ab-196e7d964387/manager/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.750303 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6dbbbff649-m9lc9_58c1e92c-4f66-475a-aa1a-119d8a9aaa88/kube-rbac-proxy/0.log" Nov 26 16:21:28 crc kubenswrapper[5200]: I1126 16:21:28.782053 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6dbbbff649-m9lc9_58c1e92c-4f66-475a-aa1a-119d8a9aaa88/manager/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.025328 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9cc5b6bd4-kksm2_e5a61fb8-33a8-4763-84ad-5973feee71a1/kube-rbac-proxy/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.038576 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc6b587b-lkn4w_01344b64-a822-408a-b687-698b296cc8db/kube-rbac-proxy/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.239868 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc6b587b-lkn4w_01344b64-a822-408a-b687-698b296cc8db/manager/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.286762 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7dd76969c9-ldrsq_4f0ee79e-a3de-4744-86a9-ca2a6b81675c/kube-rbac-proxy/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.301888 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9cc5b6bd4-kksm2_e5a61fb8-33a8-4763-84ad-5973feee71a1/manager/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.456261 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-696667c874-6stq7_d27dbce3-5af6-40a8-960c-4f6965dabcad/kube-rbac-proxy/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.580265 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7dd76969c9-ldrsq_4f0ee79e-a3de-4744-86a9-ca2a6b81675c/manager/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.588499 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-696667c874-6stq7_d27dbce3-5af6-40a8-960c-4f6965dabcad/manager/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.697171 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-784656dc9c-9mbvq_3db58e3f-7ff8-42b2-abee-c0b9f7c3d596/kube-rbac-proxy/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.811752 5200 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-784656dc9c-9mbvq_3db58e3f-7ff8-42b2-abee-c0b9f7c3d596/manager/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.925585 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6584fc9d7c-nfb9x_19d0e851-53ed-41ce-8730-3c8a1349afd6/kube-rbac-proxy/0.log" Nov 26 16:21:29 crc kubenswrapper[5200]: I1126 16:21:29.961953 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6584fc9d7c-nfb9x_19d0e851-53ed-41ce-8730-3c8a1349afd6/manager/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.002061 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74f4685-bqfwt_e0557c39-59ea-4478-a6e8-ecaed3a311d5/kube-rbac-proxy/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.211594 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-565f56787b-szbjv_c151e7d7-c8ea-4c75-b933-6ec3d84201fb/kube-rbac-proxy/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.272851 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74f4685-bqfwt_e0557c39-59ea-4478-a6e8-ecaed3a311d5/manager/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.297196 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-565f56787b-szbjv_c151e7d7-c8ea-4c75-b933-6ec3d84201fb/manager/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.354079 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv_8a630dbf-7443-4cdd-b250-73108e8abeba/kube-rbac-proxy/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.397469 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c4bfbcf46h5xfv_8a630dbf-7443-4cdd-b250-73108e8abeba/manager/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.705413 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mmlxp_6e8aad04-af7e-40c5-be4c-5c30c93c067d/registry-server/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.734663 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-798fcdcd4f-4ht44_0b996859-4ff2-4e3f-850c-fa191ea64051/operator/0.log" Nov 26 16:21:30 crc kubenswrapper[5200]: I1126 16:21:30.881428 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-698478755b-twvs8_8870e7ff-5b48-4a97-a71b-659c3ad81431/kube-rbac-proxy/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.007130 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5c76888db5-cfxkz_a32a9a76-fc65-489b-9481-73c9f7ff2870/kube-rbac-proxy/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.081395 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-698478755b-twvs8_8870e7ff-5b48-4a97-a71b-659c3ad81431/manager/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.152185 5200 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5c76888db5-cfxkz_a32a9a76-fc65-489b-9481-73c9f7ff2870/manager/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.248407 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-85d9b55b6-x6kh9_1993ad74-5fa8-4fda-8f96-774585eba2a7/operator/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.344274 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7ddc654f5-4nzdn_7e8a26f8-ad1f-4de6-a3f7-063c7635d25d/kube-rbac-proxy/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.454166 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7ddc654f5-4nzdn_7e8a26f8-ad1f-4de6-a3f7-063c7635d25d/manager/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.545034 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bdb4cd4b-d8x72_3f88a352-bb4b-49cd-9606-de25d1500c22/kube-rbac-proxy/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.668629 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55f4974db-8qmm7_2276e8f6-42e2-462a-92ea-321b00fb4664/kube-rbac-proxy/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.753913 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55f4974db-8qmm7_2276e8f6-42e2-462a-92ea-321b00fb4664/manager/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.761037 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bdb4cd4b-d8x72_3f88a352-bb4b-49cd-9606-de25d1500c22/manager/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.935055 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-89776cb76-99p75_7f7c5b44-fd61-434d-b66f-6050b7dfa67c/manager/0.log" Nov 26 16:21:31 crc kubenswrapper[5200]: I1126 16:21:31.954487 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-89776cb76-99p75_7f7c5b44-fd61-434d-b66f-6050b7dfa67c/kube-rbac-proxy/0.log" Nov 26 16:21:32 crc kubenswrapper[5200]: I1126 16:21:32.721952 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-68d9dc8f85-vzjn8_b4cb36a7-b850-428e-8c2a-ffd54329db04/manager/0.log" Nov 26 16:21:42 crc kubenswrapper[5200]: E1126 16:21:42.645035 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2344060 actualBytes=10240 Nov 26 16:21:49 crc kubenswrapper[5200]: I1126 16:21:49.694717 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-75ffdb6fcd-jkbqq_0583aa03-5210-45af-a727-9dd88a965f3b/control-plane-machine-set-operator/0.log" Nov 26 16:21:49 crc kubenswrapper[5200]: I1126 16:21:49.846245 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-g99bm_4dc74e00-faf0-4e1f-9b82-4a03c34a8928/kube-rbac-proxy/0.log" Nov 26 16:21:49 crc kubenswrapper[5200]: I1126 16:21:49.864491 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-g99bm_4dc74e00-faf0-4e1f-9b82-4a03c34a8928/machine-api-operator/0.log" Nov 26 16:22:01 crc kubenswrapper[5200]: I1126 16:22:01.822105 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858d87f86b-jznlj_89f8a963-9f48-4633-aa2a-a3126cd20edd/cert-manager-controller/0.log" Nov 26 16:22:01 crc kubenswrapper[5200]: I1126 16:22:01.951950 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7dbf76d5c8-sldvg_7dff5877-dace-4d99-84dd-7f19da274fce/cert-manager-cainjector/0.log" Nov 26 16:22:02 crc kubenswrapper[5200]: I1126 16:22:02.019214 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-7894b5b9b4-4z9qr_ecf047b8-f174-41df-9c97-11cd9496a5ee/cert-manager-webhook/0.log" Nov 26 16:22:14 crc kubenswrapper[5200]: I1126 16:22:14.905854 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-8b88fbd7-ffhn9_972422a2-6a6a-4def-bdfb-c620ac76ae3b/nmstate-console-plugin/0.log" Nov 26 16:22:15 crc kubenswrapper[5200]: I1126 16:22:15.094038 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-65tl2_50b5e447-f56c-46cb-a3d7-fc31889a8dd0/nmstate-handler/0.log" Nov 26 16:22:15 crc kubenswrapper[5200]: I1126 16:22:15.117743 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-75c64559db-98h79_55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b/nmstate-metrics/0.log" Nov 26 16:22:15 crc kubenswrapper[5200]: I1126 16:22:15.130722 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-75c64559db-98h79_55baf4f3-f4a6-4a9f-b06e-47c6e2a5e76b/kube-rbac-proxy/0.log" Nov 26 16:22:15 crc kubenswrapper[5200]: I1126 16:22:15.274578 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6757c57d5-xxf6c_8a84f308-5641-49b3-8c43-b5d3256d74b8/nmstate-operator/0.log" Nov 26 16:22:15 crc kubenswrapper[5200]: I1126 16:22:15.326292 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6dccbdf6bd-x6pb5_a74eea18-7f92-4c34-a977-fe0298221149/nmstate-webhook/0.log" Nov 26 16:22:29 crc kubenswrapper[5200]: I1126 16:22:29.684679 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7964959ff7-8q6lh_337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb/kube-rbac-proxy/0.log" Nov 26 16:22:29 crc kubenswrapper[5200]: I1126 16:22:29.964797 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5cd4469944-6sw2r_ddfe9197-d598-4037-953f-b9b764821e16/manager/0.log" Nov 26 16:22:30 crc kubenswrapper[5200]: I1126 16:22:30.087072 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-658b6bb794-ghdtx_d9858b6a-a71b-465f-8c42-0cc7fd754c24/webhook-server/0.log" Nov 26 16:22:30 crc kubenswrapper[5200]: I1126 16:22:30.349230 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gzgvm_4caee2ec-ed09-47e7-859f-5f30447cd56f/kube-rbac-proxy/0.log" Nov 26 16:22:31 crc kubenswrapper[5200]: I1126 16:22:31.065402 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7964959ff7-8q6lh_337bdc2c-c8e3-4584-9f4e-bcd5c8c4ccfb/controller/0.log" Nov 26 16:22:31 crc kubenswrapper[5200]: I1126 16:22:31.304011 5200 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gzgvm_4caee2ec-ed09-47e7-859f-5f30447cd56f/speaker/0.log" Nov 26 16:22:42 crc kubenswrapper[5200]: E1126 16:22:42.938827 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2344056 actualBytes=10240 Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.241872 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/util/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.457428 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/pull/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.474034 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/util/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.517812 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/pull/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.672991 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/util/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.714508 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/extract/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.719950 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a82ck8_32c2ca85-2020-4991-8521-ed749516482c/pull/0.log" Nov 26 16:22:43 crc kubenswrapper[5200]: I1126 16:22:43.884083 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/util/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.059636 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/pull/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.061650 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/util/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.086481 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/pull/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.295038 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/util/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.295959 5200 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/extract/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.300404 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_546b438d40b02de89722967283000c6271fd008e446699f79e0fb4df74vm4hg_40053995-8fb5-4ef3-a6c1-06570eaa42cb/pull/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.446468 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/util/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.650437 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/pull/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.664399 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/util/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.679293 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/pull/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.880370 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/extract/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.881817 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/util/0.log" Nov 26 16:22:44 crc kubenswrapper[5200]: I1126 16:22:44.991429 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210mbrx2_798950a8-4762-4d77-9392-44c8323dca2c/pull/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.110613 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/util/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.282681 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/util/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.337187 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/pull/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.369770 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/pull/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.479023 5200 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/util/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.489926 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/pull/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.552170 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f5ad09c193d240b7b3d19b8eb41490a457b1dc3eccafed6a929f33535lm5pd_fc65da8b-cbfb-4cc8-8a63-a749d7306185/extract/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.691815 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/extract-utilities/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.934866 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/extract-content/0.log" Nov 26 16:22:45 crc kubenswrapper[5200]: I1126 16:22:45.956237 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/extract-utilities/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.110739 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/extract-content/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.231202 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/registry-server/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.315481 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/extract-utilities/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.315659 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kr7zf_3d65ae68-9754-4ce6-b600-6524b4424230/extract-content/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.386703 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/extract-utilities/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.520432 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/extract-utilities/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.561406 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/extract-content/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.567276 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/extract-content/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.706843 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/extract-utilities/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: 
I1126 16:22:46.734858 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/extract-content/0.log" Nov 26 16:22:46 crc kubenswrapper[5200]: I1126 16:22:46.786603 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-547dbd544d-797xh_21cfe3b7-3913-457e-a487-87372a7b9505/marketplace-operator/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.032940 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/extract-utilities/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.263607 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/extract-content/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.271112 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/extract-utilities/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.336319 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/extract-content/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.518424 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wz9wq_12846da6-f887-4fb0-867b-cd706ca8379a/registry-server/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.553261 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/extract-utilities/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.593956 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/extract-content/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.766491 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/extract-utilities/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.914635 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4f7tr_170ad884-56d8-4fe6-8009-871a11ea9e9a/registry-server/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.975125 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/extract-content/0.log" Nov 26 16:22:47 crc kubenswrapper[5200]: I1126 16:22:47.977283 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/extract-utilities/0.log" Nov 26 16:22:48 crc kubenswrapper[5200]: I1126 16:22:48.021439 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/extract-content/0.log" Nov 26 16:22:48 crc kubenswrapper[5200]: I1126 16:22:48.210896 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/extract-utilities/0.log" Nov 26 16:22:48 crc kubenswrapper[5200]: I1126 16:22:48.242489 
5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/extract-content/0.log" Nov 26 16:22:49 crc kubenswrapper[5200]: I1126 16:22:49.449302 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j2dll_4aa9113c-1a68-4764-ad6f-2c0b804e194b/registry-server/0.log" Nov 26 16:23:00 crc kubenswrapper[5200]: I1126 16:23:00.142432 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-86648f486b-29tzs_a65bbbf4-55fe-482b-acb0-fab23e8c3b8e/prometheus-operator/0.log" Nov 26 16:23:00 crc kubenswrapper[5200]: I1126 16:23:00.239036 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-59dc8d6997-cm8hl_ea8c42ca-7b97-4758-834c-aecd4ea85609/prometheus-operator-admission-webhook/0.log" Nov 26 16:23:00 crc kubenswrapper[5200]: I1126 16:23:00.332114 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-59dc8d6997-cxmt7_8eec4a4d-8f7f-49ce-b87b-54b2f43d6c1b/prometheus-operator-admission-webhook/0.log" Nov 26 16:23:00 crc kubenswrapper[5200]: I1126 16:23:00.455232 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-78c97476f4-dlrhp_a00bd761-94c1-4961-ae82-4e1c2d7b8dae/operator/0.log" Nov 26 16:23:00 crc kubenswrapper[5200]: I1126 16:23:00.570154 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-68bdb49cbf-jxcq6_3fc7ee89-6096-441f-b376-fa628c95f1b5/perses-operator/0.log" Nov 26 16:23:03 crc kubenswrapper[5200]: I1126 16:23:03.154763 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:23:03 crc kubenswrapper[5200]: I1126 16:23:03.156400 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:23:22 crc kubenswrapper[5200]: E1126 16:23:22.121520 5200 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.224:47350->38.102.83.224:42551: write tcp 38.102.83.224:47350->38.102.83.224:42551: write: broken pipe Nov 26 16:23:24 crc kubenswrapper[5200]: E1126 16:23:24.565406 5200 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.224:47480->38.102.83.224:42551: write tcp 38.102.83.224:47480->38.102.83.224:42551: write: broken pipe Nov 26 16:23:33 crc kubenswrapper[5200]: I1126 16:23:33.155006 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:23:33 crc kubenswrapper[5200]: I1126 16:23:33.156176 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" 
podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:23:36 crc kubenswrapper[5200]: I1126 16:23:36.810049 5200 scope.go:117] "RemoveContainer" containerID="51b884e419e756562236ae78942a7550f41d274e1224658c98bce092eb8f3913" Nov 26 16:23:43 crc kubenswrapper[5200]: E1126 16:23:43.182038 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2344033 actualBytes=10240 Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.154435 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.154961 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.155012 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.155849 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c29e2008f28fd67b4e40ba4747736028cd6efcc421eb390996a19a1d2fecaab"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.155918 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://3c29e2008f28fd67b4e40ba4747736028cd6efcc421eb390996a19a1d2fecaab" gracePeriod=600 Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.733568 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="3c29e2008f28fd67b4e40ba4747736028cd6efcc421eb390996a19a1d2fecaab" exitCode=0 Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.733614 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"3c29e2008f28fd67b4e40ba4747736028cd6efcc421eb390996a19a1d2fecaab"} Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.734397 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerStarted","Data":"8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007"} Nov 26 16:24:03 crc kubenswrapper[5200]: I1126 16:24:03.734429 5200 scope.go:117] "RemoveContainer" containerID="a45b33f884aec18bc2b152b34848340a38cb7e8b3410dc7aac61a3ab27346fea" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.677247 5200 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h55q4"] Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679452 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="registry-server" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679481 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="registry-server" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679512 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="extract-utilities" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679521 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="extract-utilities" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679607 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="extract-content" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679617 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="extract-content" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.679902 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="0798d031-1c2d-40ce-b9e8-7a7c5eda5525" containerName="registry-server" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.757305 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h55q4"] Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.757609 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.862564 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-utilities\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.862852 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87c7\" (UniqueName: \"kubernetes.io/projected/e8e99236-39c9-4df4-8aa6-660b8c2518ce-kube-api-access-x87c7\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.863072 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-catalog-content\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.965875 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-utilities\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.966148 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87c7\" (UniqueName: \"kubernetes.io/projected/e8e99236-39c9-4df4-8aa6-660b8c2518ce-kube-api-access-x87c7\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.966376 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-catalog-content\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.966388 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-utilities\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.966660 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-catalog-content\") pod \"redhat-operators-h55q4\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:14 crc kubenswrapper[5200]: I1126 16:24:14.988424 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x87c7\" (UniqueName: \"kubernetes.io/projected/e8e99236-39c9-4df4-8aa6-660b8c2518ce-kube-api-access-x87c7\") pod \"redhat-operators-h55q4\" (UID: 
\"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:15 crc kubenswrapper[5200]: I1126 16:24:15.089004 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:15 crc kubenswrapper[5200]: I1126 16:24:15.607362 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h55q4"] Nov 26 16:24:15 crc kubenswrapper[5200]: I1126 16:24:15.877100 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerStarted","Data":"1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35"} Nov 26 16:24:15 crc kubenswrapper[5200]: I1126 16:24:15.878149 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerStarted","Data":"0e91c9826164cb44b21ae2acdd3f4ae507b7280a994d14f11127dbe9339420b0"} Nov 26 16:24:16 crc kubenswrapper[5200]: E1126 16:24:16.112786 5200 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8e99236_39c9_4df4_8aa6_660b8c2518ce.slice/crio-1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8e99236_39c9_4df4_8aa6_660b8c2518ce.slice/crio-conmon-1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35.scope\": RecentStats: unable to find data in memory cache]" Nov 26 16:24:16 crc kubenswrapper[5200]: I1126 16:24:16.888317 5200 generic.go:358] "Generic (PLEG): container finished" podID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerID="1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35" exitCode=0 Nov 26 16:24:16 crc kubenswrapper[5200]: I1126 16:24:16.888460 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerDied","Data":"1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35"} Nov 26 16:24:17 crc kubenswrapper[5200]: I1126 16:24:17.878440 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ffsrf"] Nov 26 16:24:17 crc kubenswrapper[5200]: I1126 16:24:17.904816 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:17 crc kubenswrapper[5200]: I1126 16:24:17.904883 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffsrf"] Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.056093 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-utilities\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.056183 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-catalog-content\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.056298 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vtmz\" (UniqueName: \"kubernetes.io/projected/85d70af5-3b7e-47a4-b3ca-7b29cb873852-kube-api-access-5vtmz\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.159238 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vtmz\" (UniqueName: \"kubernetes.io/projected/85d70af5-3b7e-47a4-b3ca-7b29cb873852-kube-api-access-5vtmz\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.160033 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-utilities\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.160115 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-catalog-content\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.160586 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-utilities\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.160611 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-catalog-content\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.183083 5200 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5vtmz\" (UniqueName: \"kubernetes.io/projected/85d70af5-3b7e-47a4-b3ca-7b29cb873852-kube-api-access-5vtmz\") pod \"community-operators-ffsrf\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.242498 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:18 crc kubenswrapper[5200]: W1126 16:24:18.774013 5200 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85d70af5_3b7e_47a4_b3ca_7b29cb873852.slice/crio-0273be2883b3d8ada090fd1acf309de0183a9492fb1fc9ec7302e5fed583a14b WatchSource:0}: Error finding container 0273be2883b3d8ada090fd1acf309de0183a9492fb1fc9ec7302e5fed583a14b: Status 404 returned error can't find the container with id 0273be2883b3d8ada090fd1acf309de0183a9492fb1fc9ec7302e5fed583a14b Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.781563 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffsrf"] Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.927909 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerStarted","Data":"5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602"} Nov 26 16:24:18 crc kubenswrapper[5200]: I1126 16:24:18.931404 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerStarted","Data":"0273be2883b3d8ada090fd1acf309de0183a9492fb1fc9ec7302e5fed583a14b"} Nov 26 16:24:19 crc kubenswrapper[5200]: I1126 16:24:19.942348 5200 generic.go:358] "Generic (PLEG): container finished" podID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerID="b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724" exitCode=0 Nov 26 16:24:19 crc kubenswrapper[5200]: I1126 16:24:19.942460 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerDied","Data":"b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724"} Nov 26 16:24:21 crc kubenswrapper[5200]: I1126 16:24:21.969900 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerStarted","Data":"da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3"} Nov 26 16:24:21 crc kubenswrapper[5200]: I1126 16:24:21.979042 5200 generic.go:358] "Generic (PLEG): container finished" podID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerID="5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602" exitCode=0 Nov 26 16:24:21 crc kubenswrapper[5200]: I1126 16:24:21.979447 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerDied","Data":"5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602"} Nov 26 16:24:22 crc kubenswrapper[5200]: I1126 16:24:22.994193 5200 generic.go:358] "Generic (PLEG): container finished" podID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" 
containerID="da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3" exitCode=0 Nov 26 16:24:22 crc kubenswrapper[5200]: I1126 16:24:22.994241 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerDied","Data":"da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3"} Nov 26 16:24:23 crc kubenswrapper[5200]: I1126 16:24:23.012578 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerStarted","Data":"dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708"} Nov 26 16:24:23 crc kubenswrapper[5200]: I1126 16:24:23.049664 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h55q4" podStartSLOduration=8.046825741 podStartE2EDuration="9.049643916s" podCreationTimestamp="2025-11-26 16:24:14 +0000 UTC" firstStartedPulling="2025-11-26 16:24:16.889778518 +0000 UTC m=+10425.568980326" lastFinishedPulling="2025-11-26 16:24:17.892596683 +0000 UTC m=+10426.571798501" observedRunningTime="2025-11-26 16:24:23.041241593 +0000 UTC m=+10431.720443431" watchObservedRunningTime="2025-11-26 16:24:23.049643916 +0000 UTC m=+10431.728845734" Nov 26 16:24:24 crc kubenswrapper[5200]: I1126 16:24:24.037327 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerStarted","Data":"3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47"} Nov 26 16:24:24 crc kubenswrapper[5200]: I1126 16:24:24.079627 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ffsrf" podStartSLOduration=5.508864807 podStartE2EDuration="7.079598934s" podCreationTimestamp="2025-11-26 16:24:17 +0000 UTC" firstStartedPulling="2025-11-26 16:24:19.944448335 +0000 UTC m=+10428.623650193" lastFinishedPulling="2025-11-26 16:24:21.515182492 +0000 UTC m=+10430.194384320" observedRunningTime="2025-11-26 16:24:24.06328593 +0000 UTC m=+10432.742487778" watchObservedRunningTime="2025-11-26 16:24:24.079598934 +0000 UTC m=+10432.758800762" Nov 26 16:24:25 crc kubenswrapper[5200]: I1126 16:24:25.089550 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:25 crc kubenswrapper[5200]: I1126 16:24:25.090083 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:26 crc kubenswrapper[5200]: I1126 16:24:26.160211 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h55q4" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="registry-server" probeResult="failure" output=< Nov 26 16:24:26 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 16:24:26 crc kubenswrapper[5200]: > Nov 26 16:24:28 crc kubenswrapper[5200]: I1126 16:24:28.242717 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:28 crc kubenswrapper[5200]: I1126 16:24:28.243328 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:28 crc 
kubenswrapper[5200]: I1126 16:24:28.289244 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:29 crc kubenswrapper[5200]: I1126 16:24:29.145759 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:29 crc kubenswrapper[5200]: I1126 16:24:29.199471 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffsrf"] Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.141542 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ffsrf" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="registry-server" containerID="cri-o://3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47" gracePeriod=2 Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.690535 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.802190 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-catalog-content\") pod \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.802619 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vtmz\" (UniqueName: \"kubernetes.io/projected/85d70af5-3b7e-47a4-b3ca-7b29cb873852-kube-api-access-5vtmz\") pod \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.803111 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-utilities\") pod \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\" (UID: \"85d70af5-3b7e-47a4-b3ca-7b29cb873852\") " Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.804094 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-utilities" (OuterVolumeSpecName: "utilities") pod "85d70af5-3b7e-47a4-b3ca-7b29cb873852" (UID: "85d70af5-3b7e-47a4-b3ca-7b29cb873852"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.804731 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.813079 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d70af5-3b7e-47a4-b3ca-7b29cb873852-kube-api-access-5vtmz" (OuterVolumeSpecName: "kube-api-access-5vtmz") pod "85d70af5-3b7e-47a4-b3ca-7b29cb873852" (UID: "85d70af5-3b7e-47a4-b3ca-7b29cb873852"). InnerVolumeSpecName "kube-api-access-5vtmz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.844897 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85d70af5-3b7e-47a4-b3ca-7b29cb873852" (UID: "85d70af5-3b7e-47a4-b3ca-7b29cb873852"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.906448 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vtmz\" (UniqueName: \"kubernetes.io/projected/85d70af5-3b7e-47a4-b3ca-7b29cb873852-kube-api-access-5vtmz\") on node \"crc\" DevicePath \"\"" Nov 26 16:24:31 crc kubenswrapper[5200]: I1126 16:24:31.906784 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d70af5-3b7e-47a4-b3ca-7b29cb873852-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.153763 5200 generic.go:358] "Generic (PLEG): container finished" podID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerID="3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47" exitCode=0 Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.153809 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerDied","Data":"3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47"} Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.153872 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffsrf" event={"ID":"85d70af5-3b7e-47a4-b3ca-7b29cb873852","Type":"ContainerDied","Data":"0273be2883b3d8ada090fd1acf309de0183a9492fb1fc9ec7302e5fed583a14b"} Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.153887 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffsrf" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.153897 5200 scope.go:117] "RemoveContainer" containerID="3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.203677 5200 scope.go:117] "RemoveContainer" containerID="da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.206161 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffsrf"] Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.216165 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ffsrf"] Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.232364 5200 scope.go:117] "RemoveContainer" containerID="b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.280713 5200 scope.go:117] "RemoveContainer" containerID="3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47" Nov 26 16:24:32 crc kubenswrapper[5200]: E1126 16:24:32.281114 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47\": container with ID starting with 3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47 not found: ID does not exist" containerID="3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.281148 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47"} err="failed to get container status \"3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47\": rpc error: code = NotFound desc = could not find container \"3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47\": container with ID starting with 3c21dacacedd50d8e2ce677830c5f04cb883bfa4a1d1c6cc1c320a936a53df47 not found: ID does not exist" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.281167 5200 scope.go:117] "RemoveContainer" containerID="da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3" Nov 26 16:24:32 crc kubenswrapper[5200]: E1126 16:24:32.281402 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3\": container with ID starting with da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3 not found: ID does not exist" containerID="da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.281423 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3"} err="failed to get container status \"da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3\": rpc error: code = NotFound desc = could not find container \"da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3\": container with ID starting with da10b89ea531bcddcef0f06a3083ca39a0eb11222f35ec6cf7d68bdee9365aa3 not found: ID does not exist" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.281435 5200 scope.go:117] "RemoveContainer" 
containerID="b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724" Nov 26 16:24:32 crc kubenswrapper[5200]: E1126 16:24:32.281771 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724\": container with ID starting with b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724 not found: ID does not exist" containerID="b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.281789 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724"} err="failed to get container status \"b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724\": rpc error: code = NotFound desc = could not find container \"b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724\": container with ID starting with b511e66d971d4cafeccff6cfa6c834b47f869c10d4d4b662a1d80fc20450b724 not found: ID does not exist" Nov 26 16:24:32 crc kubenswrapper[5200]: I1126 16:24:32.986850 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" path="/var/lib/kubelet/pods/85d70af5-3b7e-47a4-b3ca-7b29cb873852/volumes" Nov 26 16:24:36 crc kubenswrapper[5200]: I1126 16:24:36.149073 5200 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h55q4" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="registry-server" probeResult="failure" output=< Nov 26 16:24:36 crc kubenswrapper[5200]: timeout: failed to connect service ":50051" within 1s Nov 26 16:24:36 crc kubenswrapper[5200]: > Nov 26 16:24:36 crc kubenswrapper[5200]: I1126 16:24:36.888925 5200 scope.go:117] "RemoveContainer" containerID="d2485ed694d4ff156bd3ffe69303f455822a50ed6d97cd8937cd4e87212079e8" Nov 26 16:24:42 crc kubenswrapper[5200]: E1126 16:24:42.834176 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2434035 actualBytes=10240 Nov 26 16:24:45 crc kubenswrapper[5200]: I1126 16:24:45.154079 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:45 crc kubenswrapper[5200]: I1126 16:24:45.230934 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:45 crc kubenswrapper[5200]: I1126 16:24:45.875427 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h55q4"] Nov 26 16:24:46 crc kubenswrapper[5200]: I1126 16:24:46.329607 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h55q4" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="registry-server" containerID="cri-o://dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708" gracePeriod=2 Nov 26 16:24:46 crc kubenswrapper[5200]: I1126 16:24:46.868767 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:46 crc kubenswrapper[5200]: I1126 16:24:46.995075 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-catalog-content\") pod \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " Nov 26 16:24:46 crc kubenswrapper[5200]: I1126 16:24:46.995588 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-utilities\") pod \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " Nov 26 16:24:46 crc kubenswrapper[5200]: I1126 16:24:46.995649 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x87c7\" (UniqueName: \"kubernetes.io/projected/e8e99236-39c9-4df4-8aa6-660b8c2518ce-kube-api-access-x87c7\") pod \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\" (UID: \"e8e99236-39c9-4df4-8aa6-660b8c2518ce\") " Nov 26 16:24:46 crc kubenswrapper[5200]: I1126 16:24:46.997412 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-utilities" (OuterVolumeSpecName: "utilities") pod "e8e99236-39c9-4df4-8aa6-660b8c2518ce" (UID: "e8e99236-39c9-4df4-8aa6-660b8c2518ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.012786 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e99236-39c9-4df4-8aa6-660b8c2518ce-kube-api-access-x87c7" (OuterVolumeSpecName: "kube-api-access-x87c7") pod "e8e99236-39c9-4df4-8aa6-660b8c2518ce" (UID: "e8e99236-39c9-4df4-8aa6-660b8c2518ce"). InnerVolumeSpecName "kube-api-access-x87c7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.088020 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8e99236-39c9-4df4-8aa6-660b8c2518ce" (UID: "e8e99236-39c9-4df4-8aa6-660b8c2518ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.098232 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.098264 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x87c7\" (UniqueName: \"kubernetes.io/projected/e8e99236-39c9-4df4-8aa6-660b8c2518ce-kube-api-access-x87c7\") on node \"crc\" DevicePath \"\"" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.098275 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e99236-39c9-4df4-8aa6-660b8c2518ce-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.344013 5200 generic.go:358] "Generic (PLEG): container finished" podID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerID="dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708" exitCode=0 Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.344164 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerDied","Data":"dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708"} Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.344185 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h55q4" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.344476 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h55q4" event={"ID":"e8e99236-39c9-4df4-8aa6-660b8c2518ce","Type":"ContainerDied","Data":"0e91c9826164cb44b21ae2acdd3f4ae507b7280a994d14f11127dbe9339420b0"} Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.344510 5200 scope.go:117] "RemoveContainer" containerID="dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.392809 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h55q4"] Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.398451 5200 scope.go:117] "RemoveContainer" containerID="5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.406028 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h55q4"] Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.431419 5200 scope.go:117] "RemoveContainer" containerID="1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.489190 5200 scope.go:117] "RemoveContainer" containerID="dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708" Nov 26 16:24:47 crc kubenswrapper[5200]: E1126 16:24:47.489622 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708\": container with ID starting with dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708 not found: ID does not exist" containerID="dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.489671 5200 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708"} err="failed to get container status \"dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708\": rpc error: code = NotFound desc = could not find container \"dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708\": container with ID starting with dc9bc207e864e8f89308de2a87650d0b962d09d675649d6a4251670d8c19f708 not found: ID does not exist" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.489713 5200 scope.go:117] "RemoveContainer" containerID="5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602" Nov 26 16:24:47 crc kubenswrapper[5200]: E1126 16:24:47.489992 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602\": container with ID starting with 5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602 not found: ID does not exist" containerID="5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.490026 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602"} err="failed to get container status \"5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602\": rpc error: code = NotFound desc = could not find container \"5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602\": container with ID starting with 5dddd5d8942e01723b25d53927c584d839b2e02858b3e1c1ab902cd5ee5a2602 not found: ID does not exist" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.490049 5200 scope.go:117] "RemoveContainer" containerID="1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35" Nov 26 16:24:47 crc kubenswrapper[5200]: E1126 16:24:47.490392 5200 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35\": container with ID starting with 1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35 not found: ID does not exist" containerID="1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35" Nov 26 16:24:47 crc kubenswrapper[5200]: I1126 16:24:47.490424 5200 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35"} err="failed to get container status \"1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35\": rpc error: code = NotFound desc = could not find container \"1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35\": container with ID starting with 1255e4673be980a5089d9ec8c2b0dae761f13fd39ed12e6438b0dc89706a9c35 not found: ID does not exist" Nov 26 16:24:48 crc kubenswrapper[5200]: I1126 16:24:48.995300 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" path="/var/lib/kubelet/pods/e8e99236-39c9-4df4-8aa6-660b8c2518ce/volumes" Nov 26 16:24:58 crc kubenswrapper[5200]: I1126 16:24:58.493580 5200 generic.go:358] "Generic (PLEG): container finished" podID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerID="a17d7a6b9b69255e80476dfd1ac5f4257eb563469de03e0811426dc0ce514a81" exitCode=0 Nov 26 16:24:58 crc kubenswrapper[5200]: I1126 
16:24:58.493758 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pb99x/must-gather-w2ntz" event={"ID":"086fc523-d4af-43d0-8c12-b3cf4693071b","Type":"ContainerDied","Data":"a17d7a6b9b69255e80476dfd1ac5f4257eb563469de03e0811426dc0ce514a81"} Nov 26 16:24:58 crc kubenswrapper[5200]: I1126 16:24:58.494969 5200 scope.go:117] "RemoveContainer" containerID="a17d7a6b9b69255e80476dfd1ac5f4257eb563469de03e0811426dc0ce514a81" Nov 26 16:24:58 crc kubenswrapper[5200]: I1126 16:24:58.730607 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pb99x_must-gather-w2ntz_086fc523-d4af-43d0-8c12-b3cf4693071b/gather/0.log" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.110502 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pb99x/must-gather-w2ntz"] Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.111456 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-must-gather-pb99x/must-gather-w2ntz" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="copy" containerID="cri-o://83d8aeb4add5b13b6057b7b44eea17e51da6b9ce1cb30cac19a9fdd17ce7d540" gracePeriod=2 Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.113835 5200 status_manager.go:895] "Failed to get status for pod" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" pod="openshift-must-gather-pb99x/must-gather-w2ntz" err="pods \"must-gather-w2ntz\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pb99x\": no relationship found between node 'crc' and this object" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.121610 5200 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pb99x/must-gather-w2ntz"] Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.605154 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pb99x_must-gather-w2ntz_086fc523-d4af-43d0-8c12-b3cf4693071b/copy/0.log" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.609176 5200 generic.go:358] "Generic (PLEG): container finished" podID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerID="83d8aeb4add5b13b6057b7b44eea17e51da6b9ce1cb30cac19a9fdd17ce7d540" exitCode=143 Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.715203 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pb99x_must-gather-w2ntz_086fc523-d4af-43d0-8c12-b3cf4693071b/copy/0.log" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.715958 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.718569 5200 status_manager.go:895] "Failed to get status for pod" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" pod="openshift-must-gather-pb99x/must-gather-w2ntz" err="pods \"must-gather-w2ntz\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pb99x\": no relationship found between node 'crc' and this object" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.853602 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/086fc523-d4af-43d0-8c12-b3cf4693071b-must-gather-output\") pod \"086fc523-d4af-43d0-8c12-b3cf4693071b\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.854049 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv4vs\" (UniqueName: \"kubernetes.io/projected/086fc523-d4af-43d0-8c12-b3cf4693071b-kube-api-access-vv4vs\") pod \"086fc523-d4af-43d0-8c12-b3cf4693071b\" (UID: \"086fc523-d4af-43d0-8c12-b3cf4693071b\") " Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.860919 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086fc523-d4af-43d0-8c12-b3cf4693071b-kube-api-access-vv4vs" (OuterVolumeSpecName: "kube-api-access-vv4vs") pod "086fc523-d4af-43d0-8c12-b3cf4693071b" (UID: "086fc523-d4af-43d0-8c12-b3cf4693071b"). InnerVolumeSpecName "kube-api-access-vv4vs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:25:06 crc kubenswrapper[5200]: I1126 16:25:06.956525 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vv4vs\" (UniqueName: \"kubernetes.io/projected/086fc523-d4af-43d0-8c12-b3cf4693071b-kube-api-access-vv4vs\") on node \"crc\" DevicePath \"\"" Nov 26 16:25:07 crc kubenswrapper[5200]: I1126 16:25:07.036590 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086fc523-d4af-43d0-8c12-b3cf4693071b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "086fc523-d4af-43d0-8c12-b3cf4693071b" (UID: "086fc523-d4af-43d0-8c12-b3cf4693071b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:25:07 crc kubenswrapper[5200]: I1126 16:25:07.060477 5200 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/086fc523-d4af-43d0-8c12-b3cf4693071b-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 26 16:25:07 crc kubenswrapper[5200]: I1126 16:25:07.623038 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pb99x_must-gather-w2ntz_086fc523-d4af-43d0-8c12-b3cf4693071b/copy/0.log" Nov 26 16:25:07 crc kubenswrapper[5200]: I1126 16:25:07.623971 5200 scope.go:117] "RemoveContainer" containerID="83d8aeb4add5b13b6057b7b44eea17e51da6b9ce1cb30cac19a9fdd17ce7d540" Nov 26 16:25:07 crc kubenswrapper[5200]: I1126 16:25:07.623967 5200 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pb99x/must-gather-w2ntz" Nov 26 16:25:07 crc kubenswrapper[5200]: I1126 16:25:07.650132 5200 scope.go:117] "RemoveContainer" containerID="a17d7a6b9b69255e80476dfd1ac5f4257eb563469de03e0811426dc0ce514a81" Nov 26 16:25:08 crc kubenswrapper[5200]: I1126 16:25:08.985666 5200 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" path="/var/lib/kubelet/pods/086fc523-d4af-43d0-8c12-b3cf4693071b/volumes" Nov 26 16:25:40 crc kubenswrapper[5200]: I1126 16:25:40.378771 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:25:40 crc kubenswrapper[5200]: I1126 16:25:40.384497 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:25:40 crc kubenswrapper[5200]: I1126 16:25:40.387595 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc24w_1f6a6875-f90c-438b-83a3-0fccba5ff947/kube-multus/0.log" Nov 26 16:25:40 crc kubenswrapper[5200]: I1126 16:25:40.395094 5200 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/1.log" Nov 26 16:25:42 crc kubenswrapper[5200]: E1126 16:25:42.661792 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2337351 actualBytes=10240 Nov 26 16:26:03 crc kubenswrapper[5200]: I1126 16:26:03.154157 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:26:03 crc kubenswrapper[5200]: I1126 16:26:03.155847 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:26:33 crc kubenswrapper[5200]: I1126 16:26:33.154175 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:26:33 crc kubenswrapper[5200]: I1126 16:26:33.154752 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:26:42 crc kubenswrapper[5200]: E1126 16:26:42.994650 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2337350 actualBytes=10240 Nov 26 16:27:03 crc kubenswrapper[5200]: I1126 16:27:03.154936 5200 patch_prober.go:28] interesting pod/machine-config-daemon-d2nf9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 16:27:03 crc kubenswrapper[5200]: I1126 16:27:03.155561 5200 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 16:27:03 crc kubenswrapper[5200]: I1126 16:27:03.155627 5200 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" Nov 26 16:27:03 crc kubenswrapper[5200]: I1126 16:27:03.156862 5200 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007"} pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 16:27:03 crc kubenswrapper[5200]: I1126 16:27:03.157055 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" containerName="machine-config-daemon" containerID="cri-o://8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" gracePeriod=600 Nov 26 16:27:03 crc kubenswrapper[5200]: E1126 16:27:03.308064 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:27:04 crc kubenswrapper[5200]: I1126 16:27:04.268147 5200 generic.go:358] "Generic (PLEG): container finished" podID="a2eae436-2fa1-4b90-8656-d39061de1908" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" exitCode=0 Nov 26 16:27:04 crc kubenswrapper[5200]: I1126 16:27:04.268232 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" event={"ID":"a2eae436-2fa1-4b90-8656-d39061de1908","Type":"ContainerDied","Data":"8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007"} Nov 26 16:27:04 crc kubenswrapper[5200]: I1126 16:27:04.268381 5200 scope.go:117] "RemoveContainer" containerID="3c29e2008f28fd67b4e40ba4747736028cd6efcc421eb390996a19a1d2fecaab" Nov 26 16:27:04 crc kubenswrapper[5200]: I1126 16:27:04.269458 5200 scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:27:04 crc kubenswrapper[5200]: E1126 16:27:04.270108 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:27:17 crc kubenswrapper[5200]: I1126 16:27:17.969339 5200 
scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:27:17 crc kubenswrapper[5200]: E1126 16:27:17.970057 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:27:32 crc kubenswrapper[5200]: I1126 16:27:32.986565 5200 scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:27:32 crc kubenswrapper[5200]: E1126 16:27:32.987365 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:27:42 crc kubenswrapper[5200]: E1126 16:27:42.676330 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2337340 actualBytes=10240 Nov 26 16:27:46 crc kubenswrapper[5200]: I1126 16:27:46.970378 5200 scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:27:46 crc kubenswrapper[5200]: E1126 16:27:46.971743 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:27:58 crc kubenswrapper[5200]: I1126 16:27:58.969966 5200 scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:27:58 crc kubenswrapper[5200]: E1126 16:27:58.970918 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:28:13 crc kubenswrapper[5200]: I1126 16:28:13.971643 5200 scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:28:13 crc kubenswrapper[5200]: E1126 16:28:13.972785 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:28:28 crc kubenswrapper[5200]: I1126 16:28:28.969514 5200 
scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:28:28 crc kubenswrapper[5200]: E1126 16:28:28.970597 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.053606 5200 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c8pdz"] Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056377 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="registry-server" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056436 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="registry-server" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056486 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="extract-content" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056497 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="extract-content" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056540 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="extract-utilities" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056553 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="extract-utilities" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056617 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="copy" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056629 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="copy" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056640 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="extract-content" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056650 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="extract-content" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056708 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="extract-utilities" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056720 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="extract-utilities" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056761 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="gather" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056771 5200 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="gather" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056788 5200 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="registry-server" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.056798 5200 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="registry-server" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.057119 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="copy" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.057155 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="086fc523-d4af-43d0-8c12-b3cf4693071b" containerName="gather" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.057190 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8e99236-39c9-4df4-8aa6-660b8c2518ce" containerName="registry-server" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.057208 5200 memory_manager.go:356] "RemoveStaleState removing state" podUID="85d70af5-3b7e-47a4-b3ca-7b29cb873852" containerName="registry-server" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.073874 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8pdz"] Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.074005 5200 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.169702 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-catalog-content\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.169797 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdfc\" (UniqueName: \"kubernetes.io/projected/aae630df-4451-482c-9120-d65fe82adfd2-kube-api-access-szdfc\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.170201 5200 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-utilities\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.272426 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-utilities\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.272767 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-catalog-content\") pod \"redhat-marketplace-c8pdz\" 
(UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.272816 5200 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szdfc\" (UniqueName: \"kubernetes.io/projected/aae630df-4451-482c-9120-d65fe82adfd2-kube-api-access-szdfc\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.272981 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-utilities\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.273277 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-catalog-content\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.296369 5200 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdfc\" (UniqueName: \"kubernetes.io/projected/aae630df-4451-482c-9120-d65fe82adfd2-kube-api-access-szdfc\") pod \"redhat-marketplace-c8pdz\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.408184 5200 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.916818 5200 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 16:28:33 crc kubenswrapper[5200]: I1126 16:28:33.918511 5200 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8pdz"] Nov 26 16:28:34 crc kubenswrapper[5200]: I1126 16:28:34.557385 5200 generic.go:358] "Generic (PLEG): container finished" podID="aae630df-4451-482c-9120-d65fe82adfd2" containerID="81f4ccd936919f681f1ba4f49ec08e0ea74c50d0cfb0a46852052babce21235f" exitCode=0 Nov 26 16:28:34 crc kubenswrapper[5200]: I1126 16:28:34.558896 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8pdz" event={"ID":"aae630df-4451-482c-9120-d65fe82adfd2","Type":"ContainerDied","Data":"81f4ccd936919f681f1ba4f49ec08e0ea74c50d0cfb0a46852052babce21235f"} Nov 26 16:28:34 crc kubenswrapper[5200]: I1126 16:28:34.558937 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8pdz" event={"ID":"aae630df-4451-482c-9120-d65fe82adfd2","Type":"ContainerStarted","Data":"573bf1038264b273c1730ce67841432cbd934e023cf5d915dbe3d9a6edebc600"} Nov 26 16:28:35 crc kubenswrapper[5200]: I1126 16:28:35.571277 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8pdz" event={"ID":"aae630df-4451-482c-9120-d65fe82adfd2","Type":"ContainerStarted","Data":"1a1c8be920d146207ca764eabafb9e03a458ead101d1d26cb38a710c0ad439e0"} Nov 26 16:28:36 crc kubenswrapper[5200]: I1126 16:28:36.588953 5200 generic.go:358] "Generic (PLEG): container finished" podID="aae630df-4451-482c-9120-d65fe82adfd2" containerID="1a1c8be920d146207ca764eabafb9e03a458ead101d1d26cb38a710c0ad439e0" exitCode=0 Nov 26 16:28:36 crc kubenswrapper[5200]: I1126 16:28:36.589106 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8pdz" event={"ID":"aae630df-4451-482c-9120-d65fe82adfd2","Type":"ContainerDied","Data":"1a1c8be920d146207ca764eabafb9e03a458ead101d1d26cb38a710c0ad439e0"} Nov 26 16:28:37 crc kubenswrapper[5200]: I1126 16:28:37.607804 5200 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8pdz" event={"ID":"aae630df-4451-482c-9120-d65fe82adfd2","Type":"ContainerStarted","Data":"151a083e792ef8d2eba2d1393b3610154c489e24c57e0541d6d88e8475d819e8"} Nov 26 16:28:37 crc kubenswrapper[5200]: I1126 16:28:37.653911 5200 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c8pdz" podStartSLOduration=4.047220414 podStartE2EDuration="4.653885265s" podCreationTimestamp="2025-11-26 16:28:33 +0000 UTC" firstStartedPulling="2025-11-26 16:28:34.562228324 +0000 UTC m=+10683.241430182" lastFinishedPulling="2025-11-26 16:28:35.168893185 +0000 UTC m=+10683.848095033" observedRunningTime="2025-11-26 16:28:37.632285841 +0000 UTC m=+10686.311487699" watchObservedRunningTime="2025-11-26 16:28:37.653885265 +0000 UTC m=+10686.333087123" Nov 26 16:28:39 crc kubenswrapper[5200]: I1126 16:28:39.969435 5200 scope.go:117] "RemoveContainer" containerID="8ce051a5d73f760fe785723b2f9c7f03a9c77e4f7b643c865af760ea3d343007" Nov 26 16:28:39 crc kubenswrapper[5200]: E1126 16:28:39.970510 5200 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d2nf9_openshift-machine-config-operator(a2eae436-2fa1-4b90-8656-d39061de1908)\"" pod="openshift-machine-config-operator/machine-config-daemon-d2nf9" podUID="a2eae436-2fa1-4b90-8656-d39061de1908" Nov 26 16:28:42 crc kubenswrapper[5200]: E1126 16:28:42.686329 5200 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=2344066 actualBytes=10240 Nov 26 16:28:43 crc kubenswrapper[5200]: I1126 16:28:43.408424 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:43 crc kubenswrapper[5200]: I1126 16:28:43.409535 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:43 crc kubenswrapper[5200]: I1126 16:28:43.475788 5200 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:43 crc kubenswrapper[5200]: I1126 16:28:43.744522 5200 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:43 crc kubenswrapper[5200]: I1126 16:28:43.816253 5200 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8pdz"] Nov 26 16:28:45 crc kubenswrapper[5200]: I1126 16:28:45.720942 5200 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c8pdz" podUID="aae630df-4451-482c-9120-d65fe82adfd2" containerName="registry-server" containerID="cri-o://151a083e792ef8d2eba2d1393b3610154c489e24c57e0541d6d88e8475d819e8" gracePeriod=2 Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.303107 5200 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8pdz" Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.355432 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-catalog-content\") pod \"aae630df-4451-482c-9120-d65fe82adfd2\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.355615 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-utilities\") pod \"aae630df-4451-482c-9120-d65fe82adfd2\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.355917 5200 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdfc\" (UniqueName: \"kubernetes.io/projected/aae630df-4451-482c-9120-d65fe82adfd2-kube-api-access-szdfc\") pod \"aae630df-4451-482c-9120-d65fe82adfd2\" (UID: \"aae630df-4451-482c-9120-d65fe82adfd2\") " Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.356921 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-utilities" (OuterVolumeSpecName: "utilities") pod "aae630df-4451-482c-9120-d65fe82adfd2" (UID: "aae630df-4451-482c-9120-d65fe82adfd2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.357287 5200 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.361724 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae630df-4451-482c-9120-d65fe82adfd2-kube-api-access-szdfc" (OuterVolumeSpecName: "kube-api-access-szdfc") pod "aae630df-4451-482c-9120-d65fe82adfd2" (UID: "aae630df-4451-482c-9120-d65fe82adfd2"). InnerVolumeSpecName "kube-api-access-szdfc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.368736 5200 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aae630df-4451-482c-9120-d65fe82adfd2" (UID: "aae630df-4451-482c-9120-d65fe82adfd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.459735 5200 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szdfc\" (UniqueName: \"kubernetes.io/projected/aae630df-4451-482c-9120-d65fe82adfd2-kube-api-access-szdfc\") on node \"crc\" DevicePath \"\"" Nov 26 16:28:46 crc kubenswrapper[5200]: I1126 16:28:46.460049 5200 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae630df-4451-482c-9120-d65fe82adfd2-catalog-content\") on node \"crc\" DevicePath \"\""